Designing Hybrid Surveys



There are two elements to designing a hybrid survey program, which combines the depth of actionable feedback from a live-person phone interview with an online survey’s ability to cost-effectively collect huge sample sizes. In this article I’ll explore designing the survey questions and how the two feedback channels relate to each other. In a future article I’ll write about designing the survey process itself, along with some of the considerations that go into sampling and channel selection.

To get the most benefit from a hybrid survey we want to play to the strengths of each feedback channel. Online surveys are cost-effective for collecting tens of thousands or even millions of survey responses, while phone interviews let you collect a lot of detail about individual customers. Online surveys are good for calculating metrics, and phone interviews give you insight into individual customers’ stories.

Keep The Online Survey Short and Sweet

The online survey is where you get to cast a very wide net, including large numbers of customers in the survey process. This is also where most of your tracking metrics will come from. But it’s not the place to try to collect lots of detailed feedback from each customer: long survey forms often don’t get a good response rate.

I recommend limiting the online survey to a handful of key metrics plus one box for customers to enter any other comments or suggestions they may have. The particular metrics you choose will depend on your survey goals; in my experience, one metric is too few, and more than five just makes the survey longer without yielding much (if any) new information.

It’s also good practice to give customers a Service Recovery option, usually as a question at the end of the survey along the lines of, “Do you want a representative to contact you to resolve any outstanding issues?” Just make sure that those requests get routed to the right department and promptly handled.

And please please please please don’t make any of your questions mandatory. Required questions serve no purpose other than frustrating customers and should be stricken from the survey toolbox.

Go Deep In Phone Interviews

You can ask a surprising number of questions in a typical five-minute phone interview. This is the place to ask follow-up questions, maybe include some metrics that had to be eliminated from the online survey due to length (you did keep it short, right?), and most importantly, give the customer a chance to really tell her story.

I usually start with the questions from the online survey and add to them. We may need to adjust the wording of some of the questions–not every question that looks good written will sound good when read aloud–but we want to cover the same ground. One of the purposes is to compare the results from the online survey to the interview, since we normally expect the interview to give us a truer reading of the survey metrics. If metrics for the interview and online survey diverge, that’s an indication that something may be going wrong in the survey process.
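The divergence check described above can be made concrete with a simple statistical test. The sketch below is a minimal, hypothetical example (the function name, sample sizes, and "top-box" counts are all my own assumptions, not from the original): it uses a two-proportion z-test to ask whether the share of top-box ratings in the online survey differs significantly from the share in phone interviews.

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test with a pooled variance estimate.

    Returns (z, p_value) for the difference between two sample proportions.
    """
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 1,800 of 2,400 online respondents gave a
# top-box rating, versus 130 of 150 phone interviewees.
z, p = two_proportion_z(1800, 2400, 130, 150)
if p < 0.05:
    print(f"Channels diverge (z={z:.2f}, p={p:.4f}); check the survey process")
else:
    print(f"No significant divergence (z={z:.2f}, p={p:.4f})")
```

With these made-up numbers the interview sample rates the experience noticeably higher than the online sample, which under this framework would be a prompt to investigate the survey process rather than a conclusion in itself.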

It’s a good idea to keep the interview questions flexible. Unlike the core metrics in the online survey, which need to stay consistent over time, the interview questions may need to be updated frequently depending on changing business needs or the particular reason a customer was selected for an interview rather than an automated survey.

I also bias heavily towards open-ended questions on the interview. This gives the customer a chance to use their own words and will often surface unexpected feedback. If needed, the interviewer can code the responses (along with providing a written summary) to allow for tracking of the types of feedback you’re getting.
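Once the interviewer has coded the open-ended responses, tracking the types of feedback is just a matter of tallying the codes over time. A minimal sketch, assuming a hypothetical code list (the category names and data here are invented for illustration):

```python
from collections import Counter

# Hypothetical interviewer-assigned codes; one response can carry
# several codes alongside its written summary.
coded_responses = [
    ["billing_confusion", "long_hold_time"],
    ["long_hold_time"],
    ["agent_praise"],
    ["billing_confusion"],
    ["long_hold_time", "agent_praise"],
]

# Flatten the per-response code lists and count each code's frequency.
tally = Counter(code for codes in coded_responses for code in codes)
for code, count in tally.most_common():
    print(f"{code}: {count}")
```

Run over each reporting period, the same tally lets you watch categories of feedback rise or fall even though the underlying questions were open-ended.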

The end result is going to be a handful of metrics, with a healthy dollop of open-ended questions to explore the reasons behind the ratings. The metrics should be comparable to the online survey’s, so they can serve as a check on the validity of the high-volume feedback process, but the true value will be in understanding individual customer stories.

Republished with author's permission from original post.

Peter Leppik
Peter U. Leppik is president and CEO of Vocalabs. He founded Vocal Laboratories Inc. in 2001 to apply scientific principles of data collection and analysis to the problem of improving customer service. Leppik has led efforts to measure, compare and publish customer service quality through third party, independent research. At Vocalabs, Leppik has assembled a team of professionals with deep expertise in survey methodology, data communications and data visualization to provide clients with best-in-class tools for improving customer service through real-time customer feedback.

