What two airlines tell us about avoiding survey fatigue

Customers aren’t tired of surveys. They aren’t tired of giving their opinions and suggestions. Fatigue only sets in when they feel bombarded with requests. Response rates drop when surveys are too long, or when the topics or questions feel irrelevant, insensitive or poorly timed. Some customers simply think “this is too much effort” or “what’s the point”, believing that taking part will make little difference.

As with all things customer experience, there’s nuance here. Different customer groups respond differently and have different expectations. Your customers will tell you whether your surveys are well executed or whether you’re missing the mark.

None of us is immune to poor survey design. I want to share my experience of two airline surveys I received after a long-haul flight. Not to call out any specific carriers, but to offer some thoughts on how to avoid the mistakes that put customers off. I’ll walk you through the steps you can take to keep your customers motivated, get them to the end of the survey, and glean valuable insights from their feedback. Perhaps most importantly, you want them to be happy to complete the next one you send.

But first, what do customers think?

Survey fatigue – what do the numbers say?

HubSpot ran some numbers on survey fatigue. Here are a couple of insights into what people had to say about the last survey they filled in.

  • When do you lose people? After three minutes you lose nearly 15% of people; after nine minutes you lose more than 40%.

  • What tired people out? ‘Too many questions’ (23.4%), ‘weren’t motivated to answer questions about the survey topic(s)’ (10.8%) and ‘unsure what impact the survey responses would have’ (8.9%) were the top reasons people felt surveyed out.

  • What motivated people? 70.9% of respondents felt more energised if a survey included viewing a video or photo.

How can you avoid frustrating and losing participants with the way you design your surveys? Let me share my experience – there are valuable lessons here.

What puts people off?

Misjudged personalisation – you don’t know me

If I’ve just experienced a service or bought something, I don’t want to have to answer a series of questions telling the company what I bought or experienced. If you’re running a colleague survey, do you really need someone to tell you whether they manage a team or how long they have worked at your company?

As I mentioned earlier, after a recent long-haul flight I received two surveys – both of which asked me to tell the airline which flights I’d taken, the times, and in one case the seat I’d sat in. The airline had created a survey experience that didn’t feel as though they knew who I was or were that interested in my feedback.

Thoughtful survey design means avoiding asking participants to repeat information that you already have.
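In practice, that means carrying the details you already hold into the survey itself rather than asking for them again. Here is a minimal sketch of the idea, assuming a web-based survey that can read context from the invite link; the field names, booking data and URL are hypothetical:

```python
# Sketch: pre-fill known context into a survey invite instead of asking for it.
# Field names (booking_ref, flight_no, seat) and the survey URL are hypothetical.
from urllib.parse import urlencode

def build_survey_link(base_url: str, booking: dict) -> str:
    """Attach details we already hold to the survey link as hidden context."""
    context = {
        "booking_ref": booking["booking_ref"],
        "flight_no": booking["flight_no"],
        "flight_date": booking["flight_date"],
        "seat": booking["seat"],
    }
    return f"{base_url}?{urlencode(context)}"

invite_link = build_survey_link(
    "https://surveys.example.com/post-flight",
    {"booking_ref": "ABC123", "flight_no": "XY789",
     "flight_date": "2024-05-01", "seat": "34K"},
)
print(invite_link)
# The survey tool can read these parameters, so the first question can be
# about the flight itself rather than "which flight did you take?"
```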

Long surveys – but how long is too long?

Survey length is a real irritant. The airline surveys asked me 8-10 questions before they asked anything about the flight. One of the surveys had a total of 28 questions.

When designing a survey, respect your customer’s time and don’t bombard them with unnecessary questions. Branching logic can make this worse: when someone answers a question “yes” and four new questions appear below, that can be really annoying. Savvy respondents change their answer to “no” just to avoid them. (Did you use the in-flight entertainment system? Yes. Please answer these four additional questions about the in-flight entertainment system. On second thoughts, no.)

That’s not to say that short surveys are always welcomed. Surveys sent out after a contact or transaction has been completed need to be managed carefully. If I contact my mobile phone provider, they send me a short survey, usually by text, after the call to ask for my feedback. If I contact them the next day to follow up, they send me another survey. This quickly becomes very annoying, and customers will opt out or simply ignore the invitations.

As a general rule, the shorter the better, but if a survey is well promoted and customers feel their opinion will make a difference, then longer surveys can work well. We recommend regular customer surveying where invites are sent out every month to a different group of customers, either following a transaction or part-way through the relationship cycle. These tend to be short surveys of 3-8 questions, carefully managed so that an individual receives no more than four relevant invitations a year.
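One way to enforce that kind of cap is a simple throttle over each customer’s invitation history. This is only a sketch of the idea under the assumptions above (monthly waves, no more than four invites per person in a rolling year); the data structures and customer IDs are hypothetical:

```python
# Sketch: only invite customers who haven't hit the annual contact cap.
# Assumes we keep a history of past invite dates per customer (hypothetical store).
from datetime import date, timedelta

MAX_INVITES_PER_YEAR = 4

def eligible_for_invite(invite_history: list[date], today: date) -> bool:
    """True if the customer has had fewer than the cap in the last 12 months."""
    one_year_ago = today - timedelta(days=365)
    recent = [d for d in invite_history if d >= one_year_ago]
    return len(recent) < MAX_INVITES_PER_YEAR

def select_monthly_wave(customers: dict[str, list[date]], today: date) -> list[str]:
    """Pick this month's wave: everyone still under the cap."""
    return [cid for cid, history in customers.items()
            if eligible_for_invite(history, today)]

history = {
    "cust-001": [date(2024, 1, 10), date(2024, 4, 2),
                 date(2024, 7, 15), date(2024, 10, 1)],
    "cust-002": [date(2024, 3, 5)],
}
print(select_monthly_wave(history, date(2024, 11, 1)))  # ['cust-002']
```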

Radio silence – not knowing what happens with the feedback

Survey responses are collated, reports are written and circulated to executives, but customers are never told what has happened with their feedback. Did it make any difference? Is the business really interested in my feedback? Worse still, responses are often not passed on to the people who look after that customer. In a B2B environment, individual account managers and relationship managers need to know when their customers have given feedback and what they have said.

What else can you do to optimise your surveys?

Pay attention to detail

This sounds really obvious, but you’d be surprised at how many customers receive a survey that starts “Dear 07973 555456”. You need to ensure that email addresses are current and that data fields are correctly recorded. GDPR has meant that most businesses have a much better approach to data quality, but attention to detail is paramount.
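A lightweight check on merge fields before anything is sent can catch the most embarrassing mistakes. This is a rough sketch rather than a full data-quality process, and the field names and sample records are made up:

```python
# Sketch: sanity-check merge fields before sending, so nobody gets "Dear 07973 555456".
import re

def salutation_looks_valid(name: str) -> bool:
    """Reject empty values, digits, and obvious placeholder tokens."""
    if not name or not name.strip():
        return False
    if re.search(r"\d", name):  # phone numbers, account IDs, reference codes
        return False
    if name.strip().lower() in {"unknown", "n/a", "null", "customer"}:
        return False
    return True

def email_looks_valid(address: str) -> bool:
    """A minimal shape check; a real programme would also verify deliverability."""
    return bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", address or ""))

for record in [{"name": "07973 555456", "email": "jo@example.com"},
               {"name": "Jo Bloggs", "email": "jo@example.com"}]:
    if salutation_looks_valid(record["name"]) and email_looks_valid(record["email"]):
        print("Send:", record["name"])
    else:
        print("Hold for review:", record)
```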

Avoid bias

Start with all customers and try to manage a process that asks everyone about their experience. You wouldn’t run an employee opinion survey and not ask all colleagues, would you? You can review your customer segments and apply weighting after responses come in if you’re calculating scores or metrics, but starting with everyone, not a pre-selected sample, is the best approach.
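For what it’s worth, the weighting step is simple arithmetic: re-balance segment-level scores so each segment counts in proportion to its share of the customer base, however many responses it actually returned. A minimal sketch with made-up segment names, shares and scores:

```python
# Sketch: weight segment-level scores by each segment's share of the customer base.
# Segments, shares, and scores below are illustrative only.
population_share = {"retail": 0.70, "business": 0.25, "first": 0.05}
avg_score_by_segment = {"retail": 7.8, "business": 8.4, "first": 9.1}

# Each segment contributes in proportion to its real size, regardless of how
# many responses it happened to send back.
weighted_score = sum(population_share[s] * avg_score_by_segment[s]
                     for s in population_share)
print(f"Weighted overall score: {weighted_score:.2f}")  # roughly 8.0 with these numbers
```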

What’s a good response rate?

How long is a piece of string? We have a client that runs a 20-question survey every year and gets an 80% response rate from one particular customer segment; we have others that regularly get a 90%-plus response rate on their employee opinion surveys. As a general rule, you can expect a response rate of around 10% when you first run a customer survey and 40-60% on colleague surveys. We’ve found that surveys mandated by the new Consumer Duty regulations are getting higher response rates than standard customer experience surveys.

There are many techniques we use to improve response rates but, at the end of the day, they all involve good communication and follow-up with respondents.

What’s the best day to ask for feedback?

Let’s think differently about this. There’s lots of advice out there on survey timing. But if everybody follows the same advice (send a B2B survey on a Monday, say), then it becomes counter-productive. There’s only one answer: whenever your customers are most responsive.

Some final thoughts

Everything boils down to the fact that a customer or employee survey is a customer or employee experience in its own right. So businesses need to apply the same care and attention to detail as they do when designing their customer experience.

As my airline experience demonstrated, well-executed surveys are built on three key principles: make it personal, respect my time, and make me feel like my feedback is important (tell me what you have done as a result).

There’s no such thing as a perfect survey, so don’t get hung up on it. Start surveying, get some responses, review them to determine whether you’re getting what you need, and adjust if necessary; expect to make changes to the surveys in your programme. I’m sure nobody is still asking questions about how their business responded to COVID-19.

Republished with author’s permission from original post.

Charlie Williams
Charlie is a customer and employee experience expert and a Net Promoter Certified Associate. He has been working with organisations since 2006, helping them understand their customer experience and develop programmes that create a customer-centric culture, drive advocacy and build employee engagement. A specialist in the Financial Services sector, he helps clients capture evidence of Consumer Duty outcomes.
