Don’t Ask, Test

It’s probably controversial to say, but I’m not a huge fan of asking customers what products you should build to meet their needs. Truly disruptive products come from correlating a multitude of points of customer pain with market opportunity, and putting them all together is a big task. That’s your job, not the customer’s.

This kind of thinking generally runs contrary to the behavior inside product companies, which often look to product marketing for customer feedback on what should be built next. An analogy I use to show how this often works out: if you went back 100 years and asked a traveller who made frequent transatlantic trips how to improve those trips, you’d get answers like “a better stateroom”, “a faster boat” and so forth. Only a lunatic (read: visionary or entrepreneur) would have suggested “flying across the ocean”.

Apple is the canonical example here, of course. Steve Jobs has said publicly many times that they design the products they’d like to use rather than designing based on customer feedback. Contrast this with Microsoft’s “I designed Windows 7” campaign, where they claim (I suspect somewhat erroneously) that their customers’ feedback determined all of the key features in the product.

However, I am strongly in favor of asking customers questions that enable you to determine the correct product to build. Whilst most people would agree with this, it turns out that asking the right questions is an extremely skilled task. Almost everything I know on this subject I learned from a man named Mark Fuccio.

I’ve had the good fortune of working with Mark for over 10 years now. We first met when I hired his firm (Tactics) in the early days of BlueArc. Prior to BlueArc I’d been building data centers for a living and had generally been making good money displacing UNIX systems with the simpler, more modern and much less expensive Windows NT systems. This had led to several biases on my part. When Mark was brought in to analyze BlueArc’s initial market opportunity we were only engineering Windows protocols (SMB/CIFS) into the first Silicon Server and ignoring UNIX entirely (crazy in hindsight). Mark knew it would be extremely unpopular to tell the team that we needed to engineer the UNIX file serving protocol, NFS, into the product, but after performing customer testing he did just that. Mark has never been afraid to deliver bad news very directly. There’s an old adage that a consultant is a person who borrows your own watch to tell you what the time is. There’s also a lesser known and often more accurate adage (unsurprisingly told to me by a consultant) that a consultant is the guy who pulls the watch out of your a*s to show you what the time is. On that day Mark removed the watch for me, and without that vital feedback BlueArc probably wouldn’t be in existence today.

Mark was one of my first hires at Data Robotics, and his analysis led to the accurate targeting of our initial markets and to many of the attributes of the product we launched. He’s a master at creating online studies that walk the respondent through a series of paths based on their earlier answers. These complex studies are also often self-reinforcing, allowing the results to be quickly validated to separate respondents who clicked aimlessly from those who applied some thought to the questions.
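To make that mechanic concrete, here’s a minimal sketch of such a branching, self-checking survey. Everything in it is hypothetical (the questions, options, branch names and the simple paired-question check are my illustrations, not the contents of Mark’s actual studies); it just shows the two ideas: each answer selects the next question, and a repeated question worded differently lets you spot respondents who clicked aimlessly.

```python
# Hypothetical sketch of a branching survey with a built-in consistency check.
# None of these questions or checks come from the actual studies.

QUESTIONS = {
    "q_storage": {
        "text": "How do you mostly store your data today?",
        "options": ["External drives", "Network storage", "Cloud only"],
        # The answer just given selects the next question.
        "next": {
            "External drives": "q_drive_pain",
            "Network storage": "q_nas_pain",
            "Cloud only": "q_cloud_pain",
        },
    },
    "q_drive_pain": {
        "text": "What is your biggest frustration with external drives?",
        "options": ["Running out of space", "Drive failures", "Speed"],
        "next": {},  # end of this path
    },
    # ... the q_nas_pain and q_cloud_pain branches are elided ...
}

def run_survey(answer_fn, start="q_storage"):
    """Walk the question graph, using each answer to pick the next question."""
    answers, current = {}, start
    while current in QUESTIONS:
        q = QUESTIONS[current]
        answers[current] = answer_fn(q["text"], q["options"])
        current = q["next"].get(answers[current])
    return answers

def looks_consistent(answers, paired_checks):
    """Return True if answers to paired 'check' questions agree.

    paired_checks maps a question id to a later question id that asks the same
    thing in different words but with the same option labels; disagreement is a
    hint that the respondent was clicking aimlessly.
    """
    return all(
        answers.get(first) == answers.get(second)
        for first, second in paired_checks.items()
        if first in answers and second in answers
    )
```

A real survey tool would layer on more validation than a single paired question (completion time, straight-lining and so on), but the branching-plus-validation shape is the core idea.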

The most important thing I learned from Mark over the years is –

The Way You Ask Questions is Critical

One illustrative example that sticks out clearly in my mind concerned the initial pricing for Data Robotics’ first product, the Drobo, which we could supply with a range of different storage capacities depending on the number and size of hard disk drives we included. If we had just asked customers which price was best for each capacity, they’d clearly have selected the lowest one. A common tactic in studies is to ask customers to rate a price for a product as cheap, about right or expensive, but even this is limited. In the study Mark crafted, he first asked customers how much storage they wanted in the product and then presented price ranges based on that answer, whilst keeping them isolated from the alternative choices and prices. I was amazed by the results…

Drobo Initial Capacity    Expected Price
250GB                     $125
350GB                     $200
500GB                     $400
No Disks                  $520

(amounts are illustrative only and not the actual numbers from the testing)

Our testing showed clearly that customers were willing to pay more for an empty Drobo than for one containing storage, even though the empty Drobo was of course much cheaper and simpler for us to build. It was the kind of result you would never have guessed, yet it was very explicable once you could see the data. Prior to Data Robotics, all sub-$1000 storage systems shipped containing storage and were priced at a very small premium to that storage; a 500GB system was priced more or less at the same level as a 500GB hard drive. The industry had trained customers to think only about the price of the storage, and the product’s actual feature set (which was more or less identical in every other system) was ignored. Once initial capacity was removed from the question about the price of a Drobo, the customer taking the survey focused instead on the revolutionary features of the product and priced it on that value instead.
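For what it’s worth, the isolation built into that question flow can be sketched in a few lines. The capacities and price options below are placeholders rather than the study’s real numbers; the shape is the point: the respondent picks a capacity first, then sees a price question scoped only to that one configuration and never sees the other capacity/price combinations.

```python
# Placeholder sketch of the isolated pricing flow: capacity first, then a price
# question limited to that single configuration. Figures are illustrative only.

PRICE_OPTIONS = {
    "250GB":    ["$99", "$149", "$199"],
    "500GB":    ["$199", "$299", "$399"],
    "No disks": ["$399", "$499", "$599"],
}

def ask(prompt, options):
    """Stand-in for however the survey tool presents a single question."""
    print(prompt)
    for i, option in enumerate(options, 1):
        print(f"  {i}. {option}")
    return options[int(input("> ")) - 1]

def pricing_flow():
    # Step 1: capacity choice, shown with no prices attached.
    capacity = ask("How much storage would you want in the unit?",
                   list(PRICE_OPTIONS))
    # Step 2: price question scoped to that one configuration, so the answer
    # reflects perceived value rather than a comparison across capacities.
    price = ask(f"What would you expect to pay for the {capacity} configuration?",
                PRICE_OPTIONS[capacity])
    return {"capacity": capacity, "expected_price": price}
```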

That testing data ended up driving our whole strategy to launch Drobo as a diskless system, and it allowed us to set a price based on its value rather than having to compete in a low-margin market priced on the storage in the system, as our competitors did (most of them disk drive vendors, and thus at an advantage in that area). We also did some price banding at launch to further test this assumption, but I’ll discuss that in a later post.

In summary –

  • Don’t ask your customers what product you should build; instead survey them about their pain, challenges and opportunities, then design and suggest varying solutions to determine their receptivity to them.
  • Think carefully about every product decision. Is it driven by data or by internal assumptions?
  • Be thoughtful in how you ask questions and test assumptions. Even the slightest context can bias the results, but correctly performed testing can be very valuable indeed.

2 thoughts on “Don’t Ask, Test”

  1. I’m fascinated by these aspects of marketing and surveys…

    One of my favorite books is Predictably Irrational by Dan Ariely (a speaker at TED). In it he talks about all sorts of experiments he performed at MIT and other schools, and much of it relates to marketing tactics. It’s amazing how many of our seemingly free decisions are being influenced without our realizing it.

    I completely agree with the premise that customers don’t really know what they want; in general they know what improvements they’d like to see in existing products but rarely think completely outside the box. Engineers have a hard time with that as well, though.

    1. Thanks for the suggestion Richard, I’ll be sure to check out that book.

      Due to the nature of the issues you mention, companies often get founded on a great idea, but then future cycles of product releases are just implementations of lists of requested features rather than an evolution or reinvention of the main product. The really great product companies are the ones that know they need to reinvent their own cash cows in order to maintain or grow their market.
