
Don’t Engineer Anything You Can’t Sell

In any startup, resources are constrained. In a young product company, especially one that will depend on others (the channel or distributors) to sell its product, these constraints are often most prominent in engineering and marketing.

If you have a great team working at the company, team members will frequently come up with new ideas about a feature to add or a product to develop. Some of these ideas might be simple and others more complex, but all, even at a theoretical level, will have an impact on the company. As time and attention are probably the most limited resources in any organization, it’s good to make fast decisions on whether or not to implement these ideas.

At Data Robotics we had a very simple filter to determine whether an idea was worth moving forward:

What would the marketing copy look like for the idea?

There are two pieces of marketing copy to consider. The first is a one-line feature statement that could resonate with time-limited customers, such as a channel salesperson who might be representing your product. The second piece of copy (especially important if your business is not channel focused) is a brief paragraph on the feature. No need to write it down; just state it out loud. This copy doesn’t have to speak to the low-level details of the feature. For example, if it helps performance and being a high performer is something your customer cares about, then that’s your copy. High level is fine. Without the ability to represent a new feature or product simply, clearly and concisely, you won’t be able to drive any interest in your offering, and thus it’ll have little to no impact on company revenue.

In my experience it turns out to be surprisingly hard to represent the majority of proposed ideas this way, and most get dropped early on. As a result, at Data Robotics we saved a lot of time which would otherwise have been spent on discussions about what was feasible to engineer and what customers might like to have. It helped our focus and meant that the company spent its time working on what would most contribute to its growth. Give it a try and maybe you’ll also find it useful.

Don’t Ask, Test

It’s probably controversial to say, but I’m not a huge fan of asking customers what products you should build to meet their needs. Truly disruptive products are based on the correlation of a multitude of points of customer pain and market opportunity, and putting them all together is a big task. That’s your job, not the customer’s.

This kind of thinking is generally contrary to the behavior inside product companies, which often look to product marketing for feedback from customers on what should be built next. An analogy I use to show how this often works out is: if you went back 100 years and asked a traveller who made frequent transatlantic trips how to improve those trips, you’d get answers like “A Better Stateroom”, “A Faster Boat” and so forth. Only a lunatic (read: visionary or entrepreneur) would have suggested “Flying Across the Ocean”.

Apple is the canonical example here, of course. Steve Jobs has said publicly many times that they design the products they’d like to use rather than designing based on customer feedback. Contrast this with Microsoft’s “I designed Windows 7” campaign, where they claim (I suspect somewhat erroneously) that their customers’ feedback determined all of the key features in the product.

However, I am strongly in favor of asking customers questions that enable you to determine the correct product to build. Whilst most people would agree with this, it turns out that asking the right questions is an extremely skilled task. Almost everything I learned on this subject I learned from a man named Mark Fuccio.

I’ve had the good fortune of working with Mark for over 10 years now. We first met when I hired his firm (Tactics) in the early days of BlueArc. Prior to BlueArc I’d been building data centers for a living and had generally been making good money displacing UNIX systems with the simpler, more modern and much less expensive Windows NT systems. This had led to several biases on my part. When Mark was brought in to analyze BlueArc’s initial market opportunity, we were only engineering Windows protocols (SMB/CIFS) into the first Silicon Server and ignoring UNIX entirely (crazy in hindsight). Mark knew it would be extremely unpopular to tell the team that we needed to engineer the UNIX file serving protocol, NFS, into the product, but after performing customer testing he did just that. Mark has never been afraid to deliver bad news very directly. There’s an old adage that a consultant is a person who borrows your own watch to tell you what the time is. There’s also a lesser known and often more accurate adage (unsurprisingly told to me by a consultant) that a consultant is the guy who pulls the watch out of your a*s to show you what the time is. On that day Mark removed the watch for me, and without that vital feedback BlueArc probably wouldn’t be in existence today.

Mark was one of my first hires at Data Robotics, and his analysis led to the accurate targeting of our initial markets and many of the attributes of our product at launch. He’s a master at creating online studies that walk the respondent through a series of paths based on their earlier answers. These complex studies are also often self-reinforcing, allowing the results to be quickly validated to separate respondents who clicked aimlessly from those who applied some thought to the questions.
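As an aside, the validation idea is easy to sketch in code. What follows is a minimal illustration in Python, not Mark’s actual methodology: ask the same quantity twice in different forms during the survey, then flag respondents whose paired answers disagree. All field names, tolerances and sample data here are hypothetical.

    # A minimal sketch (not Mark's actual methodology): pair questions
    # that should agree and flag respondents whose answers contradict
    # each other. All field names and numbers are hypothetical.
    def is_thoughtful(response: dict, tolerance: float = 0.25) -> bool:
        """Return True if this respondent's paired answers roughly agree."""
        pairs = [
            # e.g. capacity asked in GB early on, then re-asked later
            ("capacity_wanted_gb", "capacity_reasked_gb"),
            ("budget_usd", "budget_reasked_usd"),
        ]
        for first, second in pairs:
            a, b = response.get(first), response.get(second)
            if a is None or b is None:
                return False  # skipped a validation question
            if abs(a - b) > tolerance * max(a, b):
                return False  # contradictory; likely clicked aimlessly
        return True

    responses = [
        {"capacity_wanted_gb": 500, "capacity_reasked_gb": 500,
         "budget_usd": 400, "budget_reasked_usd": 350},
        {"capacity_wanted_gb": 250, "capacity_reasked_gb": 2000,
         "budget_usd": 100, "budget_reasked_usd": 900},
    ]
    print(sum(is_thoughtful(r) for r in responses), "of", len(responses),
          "responses pass validation")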

The most important thing I learned from Mark over the years is –

The Way You Ask Questions is Critical

One illustrative example that sticks out clearly in my mind concerned the initial pricing for Data Robotics’ first product, the Drobo, which we could supply with a range of different storage capacities based on the number and size of hard disk drives we included. If we had just asked customers which price was best for each capacity, they’d clearly have selected the lowest price. A common tactic in studies is to ask customers to rate a price range for a product as cheap, about right or expensive, but even this is limited. In the study Mark crafted, he first asked customers how much storage they wanted in the product and then determined price ranges based on their answer, whilst keeping them isolated from the alternative choices and prices. I was amazed by the results…

Drobo Initial Capacity    Expected Price
250GB                     $125
350GB                     $200
500GB                     $400
No Disks                  $520

(Amounts are illustrative only, not the actual numbers from the testing.)

Our testing showed clearly that customers were willing to pay more for an empty Drobo than for one containing storage, even though the empty Drobo was of course much cheaper and simpler for us to build. This was an example of a result that you never would have guessed but that was very explicable once you could see the data. Prior to Data Robotics, all sub-$1,000 storage systems contained storage and were priced at a very small premium to that storage. Thus a 500GB system was priced more or less at the same level as a 500GB hard drive. The industry had trained customers to think only about the price of the storage, and the product’s actual feature set (which was more or less identical in every other system) was ignored. Once initial capacity was removed from the question about the price of a Drobo, the customer taking the survey focused instead on the revolutionary features of the product and priced based on that value.
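To make the isolation concrete, here is a rough sketch of the branching idea, again in Python and with hypothetical price bands rather than the real study’s numbers: the price question a respondent sees is derived only from the capacity they already chose, so they are never shown the competing capacity and price combinations that would anchor them on the cost of the drives.

    # A minimal sketch of the branching price question. The bands are
    # hypothetical, not the study's real numbers.
    PRICE_BANDS = {
        "250GB": [99, 125, 150],
        "350GB": [150, 200, 250],
        "500GB": [300, 400, 500],
        "No Disks": [400, 520, 650],
    }

    def price_question(capacity: str) -> str:
        """Build the follow-up question for one respondent's path."""
        config = ("no drives included" if capacity == "No Disks"
                  else f"{capacity} of storage included")
        options = ", ".join(f"${p}" for p in PRICE_BANDS[capacity])
        return (f"For a Drobo with {config}, which price feels "
                f"about right? {options}")

    # A respondent who chose "No Disks" sees only this branch and
    # never the per-capacity alternatives above.
    print(price_question("No Disks"))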

This data ended up driving our whole strategy to launch the Drobo as a diskless system and allowed us to set a price based on its value, rather than having to compete in a low-margin market priced on the storage in the system, as our competitors did (they were mostly disk drive vendors and thus at an advantage in that area). We also did some price banding at launch to further test this assumption, but I’ll discuss that in a later post.

In summary –

  • Don’t ask your customers what product you should build; instead, survey them about their pain, challenges and opportunities, then design and suggest varying solutions to determine their receptivity to them.
  • Think carefully about every product decision. Is it driven by data or by internal assumptions?
  • Be thoughtful in how you ask questions and test assumptions. Even the slightest context can bias the results, but correctly performed testing can be very valuable indeed.