I am not a big fan of using survey methodology for research into customer experience. I think it is a great tool for hypothesis validation, which is one part of customer experience research, but a very poor tool for learning about customers’ perceptions of dealing with a product or a company. You can never learn anything fundamentally new by asking closed-ended questions. Of course, this point of view is not shared by most companies, which pretend to want their customers’ opinions while having no desire to learn anything that contradicts their established beliefs.
However, if your toolbox is limited to the basics, you should at the very least learn to use them well. I would argue it is better not to send out a survey at all than to aggravate your customers with a poorly timed, poorly executed one.
Here are the most common shortfalls of using surveys:
- Timing – Practitioners often complain about how hard it is to engage customers in sharing their experiences. That is because 9 out of 10 attempts to engage (in the form of a survey) are viewed by customers as disruptive or poorly timed*. In other words, customers are asked to share their experience before they have had an opportunity to sufficiently experience the product or service in question. The time and channel of engagement were determined by the company without considering the perspective or convenience of its customers. Advertising the self-centered nature of your company in a world of socially connected consumers is not likely to improve your brand equity, market share, or whatever else you measure to get bonuses.
- Asking customers to quantify an experience they may not have had – The other day I received a survey from a company I like doing business with. The survey contained 30 questions, which I consider abusive. Twenty-eight of them started with “On a scale from X to Y…,” while only two started with “Why…” and “How…”. About 40% of the questions asked me to quantify my experience with parts of the company’s service that I had not yet had a reason or an opportunity to experience, yet the survey made no provision for indicating this fact. Such a design leaves a customer with two options – provide intentionally wrong answers or ignore the request to participate. Which of these two outcomes would suit your survey design goals? This customer no longer feels as good about doing business with the company as he did before receiving the survey.
- Limiting the number of words a customer can use for commentary – I do appreciate that an enterprise cannot “digest” unstructured data and needs to tabulate all responses into metrics. However, if you really want to engage your customers authentically, you need to understand that the enterprise’s “digestion” issues do not rank very high on your customers’ pyramid of emotional needs. Humans share their experiences with stories, not numbers. If you limit their ability to tell their stories, you may never learn the critical insight that would lead to a dramatic business improvement. There are methods, techniques, and technologies that can mine customer comments and quantify their sentiment without “outsourcing” your operational challenges to your customers (see the sketch after this list).
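To make that last point concrete, here is a minimal sketch of how free-text comments can be scored for sentiment, so the customer gets to tell a story while the enterprise still gets its numbers. It assumes the open-source NLTK library and its VADER lexicon; the sample comments are hypothetical, and this is one illustrative approach, not an endorsement of a particular tool.

```python
# Minimal sketch: turn free-text customer comments into sentiment metrics
# using NLTK's VADER analyzer. The sample comments below are hypothetical.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

comments = [
    "The setup was painless and support answered within minutes.",
    "Thirty rating questions about services I never used? No thanks.",
]

analyzer = SentimentIntensityAnalyzer()
for comment in comments:
    # polarity_scores returns neg/neu/pos components plus a
    # normalized compound score in the range [-1, 1]
    scores = analyzer.polarity_scores(comment)
    print(f"{scores['compound']:+.2f}  {comment}")
```

Commercial text-analytics platforms apply the same basic idea at scale: the customer writes the story, and the tooling produces the numbers.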
The common theme of this post is simple: if you want to improve your customer engagement rate, be mindful and respectful of how your customers experience your attempts to engage with them.
* Results of client-sponsored research conducted by mining the opinions expressed in 12,783 customer comments.