Customer Satisfaction Survey Response Rates – Update




Over the past couple of days I’ve been researching the general subject of survey response rates. My normal interest in the subject became elevated when I ran across an article about the ACSI (the American Customer Satisfaction Index).
What caught my attention was the following quote:


“The American Customer Satisfaction Index found that response rates for paper-based surveys were around 10% and the response rates for e-surveys (web, wap and e-mail) were averaging between 5% and 15% – which can only provide a straw poll of the customers’ opinions.”


While it’s not news that electronic survey response rates have been steadily eroding for the past twenty years, I was very surprised to read that, in at least some cases, they were now performing similarly to the long-maligned paper survey. After reading the statement, various additional questions sprang to mind. Chief among them: is that response range indicative of the entire industry, or is it a product of something that ACSI is doing? I must have visited 75 web sites in search of the answer to that question. The results were decidedly mixed.

First off, of the thousands of web survey providers out there, I ran across quite a few claiming they had achieved some pretty lofty response rates. On the strength of supposedly “proprietary techniques”, I found companies claiming that they had achieved 30%, 50%, even 80% response rates. One company even claimed to have hit 100%, more than once. Most of those assertions, however, seemed to have caveats attached, both openly stated and implied; qualifiers like “a survey of a very small body of very closely intertwined customers”. In other words, many of the high response rates were probably based on having sent out ten survey invitations. After discounting those sorts of claims, and after reading between the lines on other sites, it was clear that no one anywhere was claiming they could consistently hit numbers anywhere near those totals. In fact, no one anywhere seemed to make any claims at all as to what they could consistently hit. No averages, no medians, no realistic expectations or long-term histories of any kind.

Secondly, in my travels I ran across quite a number of articles from academics and research companies which, though presenting another fairly broad range of results, seemed to average out to a reasonable expectation of something in the 10% to 15% range, probably trending closer to the lower number. I was not able to locate a definitive voice of the industry on the matter, but will continue looking as time allows.

One opinion that I did run into over and over again was that response rates to surveys in general – and of course most references on the web are to paper, internet or telephone – have ALL been declining over the past ten years. I find that eminently believable given our own history, which has been consistent with that trend. In the mid and late ’90s, we consistently came in at 75% to 80%. By the mid to late ’00s, though, we dropped to closer to 70%, sometimes less. In the current decade we are so far running closer to 68%, and sometimes less.

There has been plenty of teeth gnashing and navel gazing around here in recent years as we have repeatedly tried to figure out why we are no longer hitting the kinds of numbers we used to routinely enjoy. We’ve reviewed our operations, our validation procedures, the content of advance notification letters, the callers we use, anything we could think of that might be having an impact on response rates and, with rare exceptions, we found nothing. The simple truth seemed to be that what worked like a charm in 1995 is simply not working as well in 2015.

There are two factors, however, that are difficult to escape. First, in 1995, customer satisfaction surveys were still a relatively new phenomenon. Companies and people were just starting to understand the value of surveys, and we had the clearly better mousetrap. In the intervening twenty years, however, everybody, and I mean EVERYBODY, has jumped onto the proverbial bandwagon. In 1995, surveys were an interesting novelty, an intriguing idea. Today they are everywhere. We are bombarded by them wherever we turn, often unable to avoid them, even when we’d prefer to. You can’t conduct business online, can’t buy something in a department store, can’t buy a light bulb at Home Depot without being asked to participate in a survey. It’s become a near glut, and like the trees in a forest, after a while you no longer even see them.

The second factor is the growth of informational incompetence among our clients. In the early ’90s we dealt with generally small companies, often “mom and pop” operations, who generally knew their customers pretty well. Today we are mainly dealing with multi-billion dollar, multi-national conglomerates who have decrepit CRM systems, who take every informational shortcut they can when assembling a customer list, and who consistently have us trying to validate former employees, former customers, non-decision makers, and the dearly departed. In other words, a big part of our problem is a classic case of garbage in, garbage out.
For me it’s been the elephant in the room for years. No one talks about response rates and yet, particularly in the B2B arena where the typical organisation has only a few hundred customers, a good, high response rate is a key component of feedback that is useful rather than merely “interesting”. What do you think?

Now take a look at How to Increase Response Rates – 6 Useful Tips

John Coldwell
From an operations background, John's attitude towards B2B customer satisfaction surveys is that they must be useful. Interesting doesn't interest him. You should be able to grab the feedback by the scruff of the neck and do something with it. For the past 15 years John has been running InfoQuest's full-day senior-team post-survey workshops around the world.


  1. I think that people don’t see the value (of their time) in completing surveys because either they don’t benefit from it, or the companies who ask questions in surveys don’t actually do anything about the feedback.

    Also, for instance, my bank SMSes me a survey after I’ve spoken to a consultant, but it’s at my cost. My feeling is – if you really and truly value my opinion, why should I have to pay to give it? It tells me that they are going through the motions and they don’t really appreciate my response. Sending me a survey just makes them LOOK good. I am not interested in superficiality, and perhaps many others are not either.

    OK, maybe I am not the one to ask, as one of my specialties is to increase customer satisfaction survey response rates and analyze the results, so I “know stuff” that much of the general public won’t. But I still think that if companies really want to know, they will pay for it in some way – either by incentivizing the recipient, or by paying for the response via SMS, etc.

