Gaining control of your contact center surveys



The idea of performance evaluation can strike fear into the most confident of individuals. So can contact center surveys. Do you remember what it feels like on the morning of an important test? A final exam? Or a presentation for that huge project? I remember very clearly the moments before my dissertation defense, thinking about how those three people held my future in their hands. Am I prepared? Will the assessment be fair? What if it’s not fair – oh my – I can feel my heart rate accelerating just thinking about it.

Assessment is necessary, but it shouldn’t be a necessary evil. As you participate in the 29 Quality Assurance Mistakes to Avoid self-assessment, explore the idea “Is your assessment of the customer experience more important than your marketing department’s?”

Marketing should not control your contact center surveys

The Marketing department approaches assessment of the customer experience from a strategic point of view. Measurement is conducted using market research strategies to encompass a company-wide view of the customer experience. All of this is totally fine.

Unfortunately, when marketing owns the contact center surveys, the assessment of the customer experience is likely to come from three or four questions on their strategic company survey. For the contact center, this type of measurement becomes impossible to manage. It’s possible that the customer completing the evaluation never interacted with your contact center, or did so months ago. This performance evaluation of your center is an unjust burden. The methodology behind any survey is critical to its value, and to avoiding the damage done in the name of progress.

Contact center subject matter experts?

Are your marketing professionals subject matter experts on call center operations? Think about the three professors who judged my dissertation research. How fair would the assessment have been if any of them were from the Chemistry department, for example? Sure, a chemist does research and presents results, but would he or she be the best judge of consumer behavior research? Knowledgeable – yes, but the best assessment expert – no. The same is true of your marketing department.

The marketing department may offer a high-level view of the contact center’s customer experience, but will the results provide you with any direction in staffing, coaching, training, process, or procedure changes?

Contact center surveys need a different scale!

Setting aside the methodology of a delayed measurement, how accurate is an analysis of performance when only three or four questions are asked on a 5-point scale? First, it is difficult to construct an accurate assessment by measuring just four constructs, and survey results built on so few cannot be adequately interpreted. Acting on that analysis would be guesswork, and risky guesswork at that.

Second, the analytics permitted by 5-point scales are truly limiting. I see organizations that reveal their “wisdom” in surveying by reporting the percentage of 3s, 4s, and 5s as “Satisfied”. That is ignorant. For satisfaction with the experience to equate to loyalty, only the 5s (on a 5-point scale) can be counted. Were you aware that 5s are difficult to receive because customers tend to lean toward the 4s rather than give perfect scores?

The 5-point scale simply does not give evaluators enough variability. It’s more accurate (and fair) to use a 9- or 10-point scale for performance evaluation, reporting, and survey scorecards. The Top 2-Box score can then be used to interpret loyalty and performance.
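To make the scoring math concrete, here is a minimal Python sketch with made-up response data. It contrasts the inflated “count 3s, 4s, and 5s as satisfied” approach on a 5-point scale with a strict Top 2-Box score on a 10-point scale:

```python
def top_2_box(scores, scale_max=10):
    """Share of responses landing in the top two points of the scale."""
    if not scores:
        return 0.0
    top = {scale_max, scale_max - 1}
    return sum(1 for s in scores if s in top) / len(scores)

# Hypothetical 5-point responses from one team.
five_point = [5, 4, 4, 3, 3, 5, 2, 4, 3, 5]
loose = sum(1 for s in five_point if s >= 3) / len(five_point)   # 3s, 4s, and 5s
strict = sum(1 for s in five_point if s == 5) / len(five_point)  # only perfect scores
print(f"Counting 3s-5s: {loose:.0%}, counting only 5s: {strict:.0%}")
# -> Counting 3s-5s: 90%, counting only 5s: 30%

# The same idea on a 10-point scale, scored Top 2-Box (9s and 10s).
ten_point = [10, 9, 8, 8, 7, 9, 10, 6, 9, 8]
print(f"Top 2-Box: {top_2_box(ten_point):.0%}")
# -> Top 2-Box: 50%
```

The gap between 90% “satisfied” and a 50% Top 2-Box score on the same team is exactly the flattering distortion the looser method produces.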

And third, influencing the customer experience occurs at the team level, if not the agent level. Do you find that the Marketing department’s assessment of the customer experience is helpful to your team leaders? It is not common for that to be the case. In fact, the general consensus is that when the information is focused on the company (is strategic in nature), it is painful noise to agents.

Take your contact center surveys into your own hands

You can better support your marketing department and their strategic goals by taking control of your contact center surveys. Because you need a tactical approach that is linked to the agents, it’s most effective for you to take the lead.

You can easily include the strategic-focused items that the marketing team wants, but proceed with a robust measurement program whose analysis supports specific, behaviorally anchored improvement plans BY TEAM. What’s important to your customer experience assessment is the ability to quantitatively determine:

  1. the drivers of first call resolution
  2. the drivers of satisfaction with the agent experience
  3. the drivers of NPS (yes, you want to do that!)
  4. the ability to quantitatively report on the qualitative data collected (customer explanations for scores, including brand-level feedback)
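As an illustration of the driver analysis described above, here is a simple Python sketch that ranks candidate drivers by the strength of their correlation with an outcome score. The survey fields and response data are hypothetical, and a real program would use a proper statistical model rather than raw Pearson correlations on a handful of rows:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical post-call survey responses: driver ratings plus an NPS score.
surveys = [
    {"agent_knowledge": 9, "wait_time": 4, "nps": 9},
    {"agent_knowledge": 6, "wait_time": 7, "nps": 5},
    {"agent_knowledge": 8, "wait_time": 5, "nps": 8},
    {"agent_knowledge": 4, "wait_time": 8, "nps": 3},
    {"agent_knowledge": 7, "wait_time": 6, "nps": 7},
    {"agent_knowledge": 10, "wait_time": 3, "nps": 10},
]

outcome = [row["nps"] for row in surveys]
drivers = ["agent_knowledge", "wait_time"]

# Rank drivers by absolute strength of association with the outcome.
ranked = sorted(
    drivers,
    key=lambda d: abs(pearson([row[d] for row in surveys], outcome)),
    reverse=True,
)
for d in ranked:
    r = pearson([row[d] for row in surveys], outcome)
    print(f"{d}: r = {r:+.2f}")
```

Run team by team, this kind of ranking is what turns survey scores into the behaviorally anchored improvement plans mentioned above.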

Your contact center surveys can do this; marketing’s relationship surveys cannot.

It may seem easier to let the Marketing department handle customer experience assessments, but isn’t the resulting pain of explaining and defending your performance metrics worse?

You need your own customer experience assessment program – one that is your defense, your guide for team-by-team coaching and improvement, and your go-to analysis of ongoing or ad-hoc customer experiences. Take control of the situation and own your contact center surveys.

Republished with author's permission from original post.

Jodie Monger
Jodie Monger, Ph.D. is the president of Customer Relationship Metrics (CRM) and a pioneer in business intelligence for the contact center industry. Dr. Jodie's work at CRM focuses on converting unstructured data into structured data for business action. Her research areas include customer experience, speech and operational analytics. Before founding CRM, she was the founding associate director of Purdue University's Center for Customer-Driven Quality.

