I took a three-day trip this week to the CXPA conference in Atlanta, and was asked to take a survey about almost every single part of the trip.
If you’re wondering why it’s getting hard to get customers to respond to e-mail surveys, this is why. I was asked to take surveys by the airline, the hotel, the conference, and even the hamburger joint where I grabbed lunch after arriving.
Most of these surveys were quite lengthy (only the survey about my visit to the airline club had a reasonable number of questions), and more than one of them had over 75 questions. Do I even need to add the fact that the overwhelming majority of the questions were completely irrelevant to my personal experience? Or the fact that some of the questions were so badly written that I couldn’t even figure out what they were asking?
All told, I was asked to answer somewhere between 200 and 300 questions about this short business trip. Most of the questions were irrelevant, and some were incomprehensible. I felt like my time had been wasted and my patience abused, all for an exercise which the companies clearly didn’t care enough about to do properly.
And that’s why it’s getting harder and harder to get customers to respond to e-mail surveys.
So how do you do it right? Here’s my advice:
- Keep it short and focused. For an online survey my mantra is “one page, no scrolling.” If you can’t fit the questions on a single screen, then your survey is too long. And while you may think there are 75 different things you need to ask about, the truth is that if you’re trying to focus on 75 things then you are focused on nothing.
- Proofread, test, and test again. There is no excuse for bad questions, but an e-mail survey can go to thousands of customers before anyone spots the mistake. Have both a customer and an outside survey professional provide feedback (not someone who helped write the survey).
- Communicate to customers that you take the survey seriously and that you respect and value their feedback. Don’t just say it, do it (following #1 and #2 above is a good start). You should also be personally responding to customers who had complaints, and regularly updating survey questions to address current issues that the survey has surfaced.
While these are all important things to do to build an effective online survey, the unfortunate truth is that the well is still being poisoned by the overwhelming number of bad online surveys. Customers have been trained to expect that any e-mail invitation to take a survey is likely to be more trouble than it’s worth, and so I expect response rates to continue to go down.
There’s both art and science involved in effective, engaging survey design. Today it takes a back seat to the appetite for massive amounts of customer data, and there seems to be little prioritization of which questions to include (essential to have, valuable to have, useful to have, nice to have). One thing both survey sponsors and developers usually fail to do is close the loop with prospective survey participants, assuring them that they will receive a) insight into the purpose of the research and b) summaries of the aggregated survey findings afterward.