You know that capturing post-call customer feedback is critical to your business. It allows you to keep your finger on the pulse of the customer and to uncover problem products, agent issues, service faults, and organizational barriers. It also reveals the positive: what is working, who is performing at a high level, and how the customer experience translates into customer satisfaction and loyalty. If you are not getting these things, you are missing out. You may think your customers are inconvenienced by being asked to participate in a post-call IVR survey. Done correctly, they will not be.
As you know, Customer Relationship Metrics conducts free Customer Insights to Action assessments of post-call IVR survey programs. Many who take advantage of this service do so because their current customer experience measurement program is not yielding information that can be used to drive process improvements inside the contact center and across the enterprise.
When people talk about their post-call IVR survey process, or about putting in a new program, we often hear, “it must be short.” Somewhere, some executive or self-proclaimed expert has said, “make our post-call survey no more than five questions, because I don’t think our customers will want to answer more than that.” It’s funny that someone who wants to collect customer feedback has never tested and analyzed customers’ tolerance for completing surveys. That’s right: customers should tell you when long is too long. I can tell you this with certainty: five questions are not enough to conduct the analysis needed to drive big changes. You want to change processes? There is no way you will do it with a five-question post-call IVR survey.
With so few questions, your entire investment of resources and your customers’ time is wasted. Sometimes your resources are worse than wasted, because survey malpractice occurs and you use the information incorrectly. There is most certainly a balance to strike in the number of questions you are ABLE to ask on your post-call survey, and your customers know the answer better than you do. Isn’t that why you want to measure in the first place: to know what your customers think?
We at Customer Relationship Metrics have been conducting post-call surveys for more than 15 years, and we thought it would be fun to dig through our archives, find some of our favorite Knuggets and Knuckleheads blog posts about post-call IVR surveys, and make them easy for you to access. Even in a ‘knucklehead’ comment there are ‘knuggets’ of wisdom to be gained!
KNUGGETS AND KNUCKLEHEAD ARTICLE LINKS
Post-call surveying can be one of your most powerful tools for capturing the customer’s true experience with your products and services. At best, you hope that by surveying your customers’ experiences you can improve processes going forward and eliminate customer dissatisfaction. Take this quote, for example:
“This representative did not listen. He was asking questions that he was scripted and should ask, but he wasn’t understanding or processing the answer he received. I had to repeat myself many times and was still not heard.”
When you receive many comments like this, it is time to go back and do some targeted contact center agent re-training. Without an immediate, formal customer feedback program, it is unlikely you will discover issues like this until the problem is chronic. Without digging into the customer experience to find out what the customer pain truly is – instead of assuming you know – you will continue to waste money on the wrong areas of the business. Spend your time and resources where they effectively improve processes to benefit customers, and thereby the business.
Of course, not all survey data is good-quality data that yields rich customer insights, but it can make you laugh out loud, like some of these choice comments below:
“Thank you again for your help. I appreciate what you were doing. You were very nice. I really have to go to the bathroom but I thought I should take this survey to give you credit because you were so good. You’re a great person. I have to go now, literally. Bye-bye.”
“Hi. I just spoke with you and you helped me with my problem. I would like to thank you. You did a very nice job. If you were a stripper I would have left you a tip. Bye.”
“If you ever go down to San Jose, you can give me a call. I think you sound hot. I’m down for some booty-smacking. So my number is XXX-XXX-XXXX. I hope to hear from you baby. Later.”
This is the perfect segue to talk about survey calibration and the importance of removing comments like the racy one above when reviewing your post-call survey data. While these crazy customer comments are certainly good for a laugh, you wouldn’t want them reaching your agents’ report cards. The beauty of the survey calibration process is the opportunity to find ‘knuggets’ of wisdom that deliver business intelligence from the voice of the customer while keeping the results in the aggregate only.
Hopefully this Knuggets and Knuckleheads trip down memory lane reminds you that post-call IVR surveys have their place and purpose, and when executed correctly can uncover true business intelligence about customer dissatisfaction to improve business processes. Settle for nothing less.
A nice introduction to post-call IVR surveys, but it fails to point out some of the drawbacks of an automated survey like this. Certainly an automated survey is better than nothing, but users need to understand the dark side too.
Yes, once in place this is a really inexpensive way to get customer feedback, but:
• IVR survey opt-in rates are normally very low, typically in the low single digits, so you are not getting feedback from a good cross-section of your customers.
• People don’t like to talk to robots and will abandon the survey if it is too long. This significantly limits the number of questions you can ask.
• Results are often bimodal: the people who respond are either very satisfied or very dissatisfied, and you miss the feedback from the middle.
• Normally there is no opportunity to ask a follow-up question, such as: why did they rate you a 1 on a 1-5 scale?
• If there is an open-response question – which I recommend, though many systems do not have this capability – you have to have a way to monitor the feedback. If you’re getting hundreds or thousands of responses, this becomes quite time consuming, and you may need to invest in a speech analytics tool to stay on top of it.
• Make sure the agent does not interfere in any way with the survey; they can significantly bias the results.