5 Common Mistakes in Product and Market Development Reviews

In my last post I shared some best practices for putting customers at the center of your product review process. Over the years I've also made my fair share of mistakes creating reviews and learned what doesn't work well. In the spirit of helping others learn from those mistakes, here are 5 common ones I've made and have since learned to stop making, with better results.

You have a product in development and your list of vetted targets ready to go. This is a critical time in the product development process: a chance to see how potential customers are responding to what you are creating, and an opportunity to provide a great experience to those giving feedback. Review questionnaires are just one small item in your market/product development "toolbox," but the results can have a significant impact on how you continue to develop your product.

Avoid these 5 common mistakes:

  • Asking for more than you need answered. I made a habit of reviewing product review questionnaires for my team, and when I found myself getting bored clicking through all of the questions, I knew there was a problem in our setup. Always ask yourself: How will I use the information I receive from this question? Do I really need to know this? Is this the best way to ask it? How else might I get this information? When I have posed these questions to my team, we've been able to edit heavily, down to what we really needed to know. This not only makes your questions more targeted to what will drive product development, it also makes for a streamlined, better experience for your customers. Ask only the questions you need answered.

  • Taking too long to get to the "meaty" questions. Reviewers have a limited attention span. I've seen hundreds of surveys and reviews whose first few pages included question after question about the user's background, expertise, course goals, product in use, and what's happening at the school, within the committee, within the administration, and at the state level. By the time reviewers got to the meaty questions, the ones the product team really needed answered, they had already spent half an hour on these other pieces. I'm not discounting this information; it's necessary, and in some cases it may be exactly what you need. But put the questions where you want reviewers to focus the majority of their time up front. That's when they're at their freshest.

  • Not mixing it up enough. We all have our standard question types and survey formats we're used to using. They're our go-to, probably passed down from colleagues or managers as examples of best practices for sending out reviews. Blow this up on occasion. Consider what kind of review is most appropriate for where you are in the process and for the information you need, and don't hesitate to try something completely new. One review whose responses still stand out for me was intentionally minimalist. I asked reviewers to do competitive comparisons of digital products, but I didn't provide any review questions or specifics about the information I needed, just the products I wanted them to compare. The prompt was simple: tell me what you think the product team needs to know, and rank what's most important to you. There were a lot of follow-up questions along the lines of "Can you just tell me what you want?" (a new experience for them too), but I pushed back, because how they chose to attack the review would give more insight than their responses to specific questions I would have generated. I couldn't do this with 200 reviewers, and it wasn't my approach for most reviews, but the responses gave valuable information about what my potential users thought I most needed to know. Here my questions didn't guide their responses. Blowing it up gave me much greater insight into my customers.

  • Allowing "undecided" to be an option. At the end of many review questionnaires there is a question similar to this:

     "How likely are you to use this product?"
    • Very likely
    • Likely
    • Undecided
    • Unlikely
    • Very unlikely  

    Remove "undecided." It's an easy out and doesn't tell you anything. Make your customers decide whether they fall on the likely or the unlikely side of the scale. Always know which way they are leaning.
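
    The same question works harder with the easy out removed:

    "How likely are you to use this product?"
    • Very likely
    • Likely
    • Unlikely
    • Very unlikely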

  • Relying only on what you can quantify. With the question above, there is a strong tendency to quantify how many customers are "very likely" or "likely" to use your product each time you review; it gives confidence that product development is going in the right direction, and it's a nice metric to share with the executive and finance teams to warrant continued investment. But I would argue that no real insight comes from this question. The insight comes from hearing from potential customers, in their own words and as often as you can, what matters most to them and how your product stacks up against those needs.

Product review questionnaires are an important touchpoint with your market, and they are most impactful when you involve the right people at the right time and plan strategic, consistent follow-up. Avoid these pitfalls in your setup to get the most out of this important activity.

Are your reviews garnering the feedback you need to drive the development of products your markets want to use?  Contact us. We can help.