We’ve all gotten a survey invitation on a store receipt.
A 2016 study from Interaction Metrics found that 41 of the 51 largest U.S. retailers included a survey invitation on the standard receipt. The study evaluated each survey to see how useful and engaging it was.
Not a single one was fully engaging and scientific.
The study also found that 68 percent of the surveys were “total garbage,” meaning the surveys were so flawed they weren’t worth the time required to complete them.
You can view the entire study here. Below is a summary of the results along with some action items and resources to help improve your organization’s customer satisfaction survey.
How the Study Worked
The study assessed surveys against four criteria, each weighted to reflect its relative importance:
- Access: Ease of locating and beginning the survey (5%)
- Branding: Style reflecting the brand, correct spelling and grammar (10%)
- Engaging: Keep customers engaged throughout the process (35%)
- Accuracy: Survey design that yielded accurate data (50%)
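To see how those weights combine, here's a minimal sketch of the weighted scoring. The category scores below are made up for illustration; the study doesn't publish per-retailer numbers in this form, only the weights.

```python
# Weights from the study's four criteria (they sum to 1.0).
WEIGHTS = {"access": 0.05, "branding": 0.10, "engaging": 0.35, "accuracy": 0.50}

def overall_score(scores):
    """Weighted average of per-category scores, each on a 0-100 scale."""
    return sum(WEIGHTS[category] * scores[category] for category in WEIGHTS)

# Hypothetical category scores for one retailer's survey:
example = {"access": 90, "branding": 80, "engaging": 40, "accuracy": 30}
print(overall_score(example))
```

Note how the weighting works in practice: because accuracy and engagement carry 85 percent of the total, a survey that is easy to find and nicely branded can still score poorly overall.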
The surveys were all obtained by making purchases from the retailer, either in store or online.
Accuracy Flaws Uncovered
Inaccurate data can prevent companies from taking the right action to improve service.
Or worse, a survey might be gamed to yield high scores that disguise the fact that service needs improvement.
Asking leading questions was one of the most prevalent flaws, showing up in 92 percent of the surveys examined. These are questions that are worded in a way that naturally leads customers to a particular answer.
For example, Ace Hardware had this question on its survey:
How satisfied were you with the speed of your checkout?
The problem with a question like this is it assumes the customer was satisfied. This assumptive wording makes a positive answer more likely.
A more neutral question might ask, “How would you rate the speed of your checkout?”
Another issue was the use of overly positive wording that can bias a customer's response. The study found that 82 percent of surveys contained at least one question with overly positive wording.
Here’s an example from GAP:
The look and feel of the store was very appealing.
This question also suffers from vague wording. Does “look and feel” refer to branding such as signage, displays, and decor? Or does it refer to cleanliness and organization? Perhaps it means the store’s layout?
Here’s an example from the now-defunct Sports Authority, where a cashier biased the survey in another way: he stamped the expected response right on the invitation.
Engagement Flaws Revealed
Surveys reflect on your company’s brand.
They’re part of the customer journey. Many retailers have made their surveys so needlessly long or aggravating that the survey itself reflects poorly on the brand, like this egregious example from Buffalo Wild Wings that required customers to navigate through 39 different screens!
The average retailer’s survey had 23 questions.
That’s a tedious number of questions to expect customers to answer. Nordstrom advertised that its survey took just 2 minutes, but it contained 25 questions and actually took 4 minutes to complete.
The study found that 13 percent of surveys were difficult to access. Walmart required customers to enter not one but two receipt codes. Rite Aid, Ross, and Walgreens all had broken links.
The best surveys are short and easy to complete. In many cases, you can capture troves of useful data with just three questions.
There are many resources to help you develop, implement, and refine your customer service survey while avoiding these mistakes. Here are just a few: