Dangers of Dumbing Down Customer Research



Just about every time I buy something from a major retailer or stay at a leading hotel these days, I am being asked to tell them how they did in serving me. They are all using “cash register tape” research—the modern-day equivalent of what I used to refer to as “seat pocket” research.

You remember: years ago when you would get on an airplane, there’d be a questionnaire in the seat pocket. Partway into the flight, the flight attendant would ask you to complete the survey and would collect it just before landing. Among other problems, it seemed to me they were never able to capture feedback on how the pilot handled the landing or whether your bag showed up intact at the baggage carousel or was on its way to Miami.

These days, many companies are using a similar approach that involves inviting you to visit their website to complete a survey. In the past week, partly because I wanted to see what kind of survey was being administered, I have completed three of these.

Online customer feedback is now phenomenally popular because it’s possible, it’s easy and it’s cheap.

The Home Depot offered me a chance to win a $1000 gift card if I would simply go to their website, key in the store number and invoice number and complete the survey. Westin asked for my rating of my most recent stay through an e-mail survey. While they offered me no reward, my visit had been sufficiently catastrophic that I gladly provided my feedback. Starbucks gave me a number once I had completed their survey. I simply took that number to my local Starbucks and claimed a free grande latte. In at least one of the surveys, I also provided my postal code—so now the company knows where I live.

Why are so many firms making use of this approach to research and how good is it anyway? Quite simply, online customer feedback is now phenomenally popular because it’s possible, it’s easy and it’s cheap. Sophisticated databases make it possible for retailers, for example, to connect a customer’s online survey results with his or her transaction details, including what was purchased, how much was spent, whether items were on sale, whether coupons were used, time of day, and the name of the cashier, desk clerk, or barista who served the customer.

DIY research

This approach to data collection makes it possible for the firm to collect huge volumes of data and to connect the customer’s rating of the service received to all sorts of details about what was bought and the neighborhood where he or she lives—all without having to resort to using the services of marketing research professionals. By designing a customer service or customer satisfaction survey that is tied into their POS system and customer database, companies can blanket their entire customer base or can select particular locations or time periods to identify a sample of customers on whose cash register tape the survey information will appear or who will be contacted via e-mail.

And, it’s much less costly than data collection used to be in the complicated old days when companies engaged research firms to do telephone research (does anyone even remember door-to-door or mail surveys?), because now there is no need to invest time and effort in contacting customers. Instead, just let customers decide if they wish to contribute their feedback. The Home Depot collects thousands of completed questionnaires for the price of a $1000 gift card. Starbucks obtained my valuable input for the cost (not the price) of a latte. Westin obtained it for free.

The result is that companies are getting phenomenal volumes of research data at very reasonable cost, but they may also be getting the research they deserve. There is a danger that retailers and others who are collecting customer feedback in this way actually believe that this form of customer research is sufficient. It’s typically very tactical, and not particularly insightful. Its power lies in its allowing the company to assess its customer satisfaction and service quality in considerable detail and to tie the ratings provided by the customer back to purchase behavior and other aspects of the interaction that have been captured already.

What’s missing?

Are the responses we are getting from online customer satisfaction surveys representative of our customer base? They are almost certainly not, as the views of those who decide to participate are not the same as those of customers who opt not to enter the lottery to win the gift card. Remember when we used to assess bias? There was a time when marketing researchers paid a great deal of attention to obtaining as unbiased a sample as possible. Today, with this opt-in kind of research, everyone’s opinion is welcomed and put into the pot.
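The self-selection effect described above can be made concrete with a small simulation. The response rates below are hypothetical, chosen only to illustrate the mechanism: if dissatisfied customers are more likely to opt in, the survey average drifts away from the true customer-base average.

```python
# Illustrative simulation of opt-in (self-selection) bias.
# All numbers here are hypothetical, not from any real survey.
import random

random.seed(1)

# True satisfaction ratings (1-5) across the whole customer base.
customers = [random.choice([1, 2, 3, 4, 5]) for _ in range(100_000)]

# Assume unhappy customers are far more likely to opt in to the survey.
response_rate = {1: 0.30, 2: 0.20, 3: 0.05, 4: 0.05, 5: 0.10}
respondents = [r for r in customers if random.random() < response_rate[r]]

true_mean = sum(customers) / len(customers)
survey_mean = sum(respondents) / len(respondents)

# The opt-in sample understates satisfaction relative to the full base,
# even though every individual answer is honest.
print(f"customer base: {true_mean:.2f}, opt-in survey: {survey_mean:.2f}")
```

Nothing about the individual responses is wrong; the distortion comes entirely from who chooses to answer, which is exactly why an opt-in lottery cannot substitute for a designed sample.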

Companies should use the money saved through the use of online research to invest in a deeper understanding of their customers.

Recent years have seen the disbanding of in-house marketing research departments in many large firms. One result is that there is no one on staff to keep a watchful and informed eye on the quality of research that’s being done. In many firms, senior managers understand that some customer research is needed, particularly research that will allow them to assess customer satisfaction levels.

Many of these executives feel that online customer surveys are enough. They are getting their customer satisfaction information, but they may be missing out on deeper thinking on what motivates, disappoints, satisfies, frustrates, impresses or delights customers. Companies should use the money saved through the use of online research to invest in a deeper understanding of their customers.

Is this kind of research useful? Of course it is—provided it is done correctly and we acknowledge its limitations. Does it have a place in a company’s battery of customer research tools? Absolutely. But we should be asking some important questions about whether the information obtained is really representative of what our customers are thinking and about whether it allows us to really understand our customers.

Technology is not expertise

The availability of new technologies has enabled just about anyone to weigh in with his or her opinion. Because the technology exists and it’s easy to do, suddenly anyone can be a customer researcher.

Before you put all of your eggs into this online research basket, ask some questions that a trained researcher and statistician can answer. Only then should you be confident enough to make decisions based on the results.


  1. Jim –

    Great post. Funny, I posed nearly the same question to a group of my peers, in-house market research professionals, at an industry conference several weeks ago. Started out by asking them how many of them actually fill out those surveys they personally get from other companies (and, as I suspected, the answer was nearly unanimously no). So the next question began the discussion – “how, then, do you account for the bias in responses”?

    At the very least, I started them thinking and questioning.

    On the positive side, I do see some promise, at least on the big-picture/broad-opinion side, from advances in tools for listening to conversations in social media. Granted, bias still exists there too: 1) only a % of customers use social media, 2) a large % of posts are bi-modal (strongly positive or strongly negative), and 3) they respond to a (hopefully) small % of engagements and/or transactions. Still, I would put more trust (albeit a relative measure) in them than in responses to cash-register-receipt invitations to surveys. For instance, how could those survey someone who engaged, but did not make a transaction?


  2. One way of making ‘random’ web research more usable is to put some pre-qualifiers on it. These can help stratify this type of research, giving internal analysts the ability to cross-reference respondent statistics with customer-base or general-population distributions. With this information they can also apply simple weighting to specific responder groups and smooth out levels of over-/under-indexing.
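The weighting idea in the comment above can be sketched in code. A minimal post-stratification example follows; the group names and population shares are hypothetical, purely illustrative:

```python
# Post-stratification weighting sketch. Group labels and shares below
# are invented for illustration, not taken from the article.
from collections import Counter

def poststratification_weights(respondent_groups, population_shares):
    """Weight each responder group so the weighted sample matches
    known population (or customer-base) proportions."""
    counts = Counter(respondent_groups)
    n = len(respondent_groups)
    weights = {}
    for group, pop_share in population_shares.items():
        sample_share = counts.get(group, 0) / n
        # Over-represented groups get weights < 1, under-represented > 1.
        weights[group] = pop_share / sample_share if sample_share else 0.0
    return weights

# Example: survey respondents skew toward frequent shoppers,
# who are only 40% of the actual customer base.
respondents = ["frequent"] * 70 + ["occasional"] * 30
known_shares = {"frequent": 0.40, "occasional": 0.60}
w = poststratification_weights(respondents, known_shares)
# frequent respondents are down-weighted, occasional ones up-weighted
```

This only corrects for imbalance on variables you can observe and qualify on; it cannot fix the deeper opt-in bias the article describes, where willingness to respond itself correlates with opinion.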


  4. I am in the habit of trying to take as many of the surveys I receive on register receipts as possible. Why? Several reasons. 1st- I want to find out if they are actually paying attention when a customer has a less than favorable visit to their business. My finding: maybe 35-40% are. They have contacted me to correct any issue I might have had. Those are not real good percentages in customer service in my book. I try not to patronize the place again if at all possible if I get no response (and not an automated one, either). 2nd- Who couldn’t use $1000 to spend at a favorite business? I have never even heard of anyone winning the $1000 offered, so I am starting to question the time spent. Honestly, as many as I have filled out for the same 10 companies, I should have gotten a nibble on one anyway. Never saw it happen to anyone yet.
    3rd- If I have had a good experience I want the business to know, and I make a point to compliment the worker. If I had a bad one, I want them to know about it! I don’t always wait for a survey to pop out. I have contacted many a business using the contact-us link to let them know just what the problem is, with whom, how upset I am, and an expectation of reply—or don’t even bother, I’m done with your business. So there is my 2 cents’ worth. And just for the record, don’t contests like these have to post the winners for public record, and where do they post? I want to see someone who actually has won $1000 like it says! Good Day all:)


