Consumer Research: Poor research approaches give poor answers


This post from the Harvard Business Review's Daily Stat on 15 February 2010 shows a surprising lack of insight into how consumers actually respond to research questionnaires. It reports a McKinsey survey in which consumers were asked whether they were more interested in core benefits or in bells and whistles.

I don't believe the results.

The issue here is the way the survey was conducted. In this case, asking customers what they want is unlikely to give you an accurate answer. Not because they won't tell you the truth, but because they are unlikely to act, in practice, the way they say they will act, in theory.

If you really want to know what consumers want then, in this case, it's best to look at what they actually do. On this question (core benefits or bells and whistles) the analysis is relatively easy: compare sales of the upmarket "bells and whistles" models with the downmarket "core benefits" models. That way you see what customers actually do rather than what they say they do.
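
As a minimal sketch of that revealed-preference check, assuming a hypothetical sales table (model names, tiers, and unit figures are all invented for illustration):

```python
# Revealed preference: look at what customers buy, not what they say.
# Model names, tiers, and unit figures are all hypothetical.
sales = [
    {"model": "A100", "tier": "core benefits",      "units": 18200},
    {"model": "A300", "tier": "bells and whistles", "units": 23900},
    {"model": "A500", "tier": "bells and whistles", "units": 9400},
]

total = sum(row["units"] for row in sales)
for tier in ("core benefits", "bells and whistles"):
    units = sum(row["units"] for row in sales if row["tier"] == tier)
    print(f"{tier:20s} {units:6d} units ({units / total:.0%} of sales)")
```

The share of units sold in each tier answers the question the questionnaire was trying to ask, but from behaviour instead of opinion.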

Poor research design like this is a common problem.

For instance, we've seen loyalty program research in which customers, asked whether they wanted points or discounts, overwhelmingly said discounts. Those same customers then went on to collect points, respond to points promotions, and so on.

And there was the supermarket research that asked customers whether they would buy "store brands". When the responses were compared with actual purchase data, many customers with baskets full of store-brand products had said they would never lower their standards that far.
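
Both cases come down to the same check: join what each customer said to what they actually did. A minimal sketch, with invented customer IDs, survey answers, and baskets:

```python
# Contrast stated preference ("would you ever buy store brands?") with
# observed behaviour (share of store-brand items in each basket).
# Customer IDs, answers, and baskets are invented for illustration.
survey = {"c1": "never", "c2": "sometimes", "c3": "never"}

baskets = {
    "c1": ["store:pasta", "store:rice", "brand:coffee"],
    "c2": ["brand:tea", "store:flour"],
    "c3": ["store:milk", "store:bread", "store:eggs", "brand:jam"],
}

for cust, answer in survey.items():
    items = baskets.get(cust, [])
    share = sum(i.startswith("store:") for i in items) / len(items) if items else 0.0
    note = "  <-- says one thing, does another" if answer == "never" and share > 0.5 else ""
    print(f"{cust}: says '{answer}', store-brand share {share:.0%}{note}")
```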

If you naively believed what customers told you in either of these circumstances you could easily build the completely wrong customer experience.

Consumer research is fraught with this kind of problem, and when designing your research approach you need to act defensively to guard against incorrect or inaccurate results. If you don't, you will get a pretty chart that looks nice in the report, but the insights in it will be just plain wrong.

Another example of why you shouldn't take what customers say at face value is the "how important is feature x" type of question. Understanding how important a service feature is to the customer is critical in designing good service processes, so questionnaires are often used to determine which features matter most.

The problem comes when the surveys are poorly implemented. You've probably seen the bad ones; they look like this (paraphrasing):

Q1: How good is our price?

Q2: How important is price?

Q3: How good is our service responsiveness?

Q4: How important is responsiveness?

Q5: How good is our feature x?

Q6: How important is feature x?

Even before I see the results I know what the answers will be. Everything is important: 9s or 10s out of 10. So what do you know now? Nothing more than you knew before, because everything has the same high importance. You're back to square one.
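
A toy illustration of why those answers carry no information (the ratings below are invented): when every mean clusters at the top of the scale, the resulting ranking is noise.

```python
import statistics

# Stated-importance ratings on a 1-10 scale; invented, but typical of
# the "how important is x?" format: everything scores 9 or 10.
stated = {
    "price":          [10, 9, 10, 9, 10],
    "responsiveness": [9, 10, 9, 10, 9],
    "feature x":      [10, 10, 9, 9, 9],
}

for feature, scores in sorted(stated.items(), key=lambda kv: -statistics.mean(kv[1])):
    print(f"{feature:15s} mean importance: {statistics.mean(scores):.1f}")
# All means land between 9.4 and 9.6 -- the ranking tells you nothing.
```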

Even if the results are not all 9s and 10s they will be skewed by what customers want you to think.  For instance, no customer is going to tell you that price is unimportant, lest you decide to put up their price.

There are at least two better approaches than this:

  1. Some type of forced ranking: force the customer to trade off the importance of each feature, using a points allocation, a straight ranking, or a best-worst (MaxDiff) approach.
  2. Infer importance: design the survey so that you can infer what is important from the answers to other questions, or from actual customer behaviour. This takes a bit of additional analysis but is well worth the extra effort. (A sketch of both approaches follows this list.)
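
A minimal sketch of both, with invented responses and ratings: best-worst scores for the forced-ranking route, and a simple derived-importance correlation for the inference route. A real study would use a proper MaxDiff or regression model, but the logic is the same:

```python
import statistics

# 1. Forced ranking via best-worst (MaxDiff-style) choices: each respondent
#    picks the most and least important feature from a small set; the score
#    is (times picked best - times picked worst) / times shown.
#    Responses are invented for illustration.
responses = [
    {"shown": ["price", "responsiveness", "range"], "best": "responsiveness", "worst": "range"},
    {"shown": ["price", "responsiveness", "range"], "best": "price",          "worst": "range"},
    {"shown": ["price", "responsiveness", "range"], "best": "responsiveness", "worst": "price"},
]
for feature in ("price", "responsiveness", "range"):
    shown = sum(feature in r["shown"] for r in responses)
    score = sum(r["best"] == feature for r in responses) - sum(r["worst"] == feature for r in responses)
    print(f"{feature:15s} best-worst score: {score / shown:+.2f}")

# 2. Inferred (derived) importance: correlate each feature's satisfaction
#    rating with overall satisfaction. Features that move with the overall
#    score are the real drivers, whatever respondents claim. Ratings invented.
ratings = {
    "overall":        [9, 4, 7, 8, 5, 6],
    "price":          [8, 8, 7, 7, 7, 8],  # uncorrelated with overall -> weak driver
    "responsiveness": [9, 3, 6, 8, 4, 6],  # tracks overall -> strong driver
}
for feature in ("price", "responsiveness"):
    r = statistics.correlation(ratings[feature], ratings["overall"])  # Python 3.10+
    print(f"{feature:15s} derived importance (correlation): {r:+.2f}")
```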

Both of these alternatives will deliver a much more accurate outcome.

So, when you next look at the survey questionnaire your agency has provided, act defensively, and think about whether the answers you get will be accurate.

Have you seen other poor research approaches?  Leave a comment and let me know.

Republished with author's permission from original post.

1 COMMENT

Adam, thanks for an insightful post.

The proliferation of easy and cheap survey tools makes it simple to capture customer feedback. Whether any real insight is gained is another story.
