This is the third post in a series on identifying performance gaps in call center agents. The first post focused on how to create a bullet-proof monitoring form, the second on how to develop a companion Call Quality Guide. Once these two tools are created and tested, ongoing call calibration will be the key to your success.
In a call calibration session, participants listen to calls—either before or during the session—and score them according to the Monitoring Form and Call Quality Guide. Scores are shared and any discrepancies are reconciled, either by refining the forms if they're newly created, or by providing additional training for participants. The goal of call calibration is to ensure that everyone responsible for scoring calls does so consistently.
If you’ve recently created or changed your Monitoring Form and Call Quality Guide, holding a calibration session will help you fine-tune the forms.
If not everyone agrees that a Standard has been met, perhaps the Standard doesn't have a yes/no answer, and it either may not be a Standard at all or needs further definition. If not everyone scores an Objective the same way, asking why will help you be more specific in describing the behaviors that make up each score. If your team doesn't score calls the same way immediately, don't despair! It often takes several hour-long sessions before everyone begins to score calls in a uniform way.
Once your forms are fine-tuned and those listening to calls are scoring them consistently, it's crucial to hold ongoing calibration sessions—weekly is not too frequent. Ongoing calibration sessions provide the following benefits:
- They ensure that everyone who monitors and scores calls has a thorough understanding of the performance Standards and Objectives
- Supervisors and QA monitors apply standards of evaluation uniformly
- Behavior is measured consistently—no matter which call is monitored or which person scores the call
- Agent satisfaction improves because agents receive consistent feedback no matter which supervisor or QA monitor provides it
- Because everyone—agents and leaders alike—knows what is expected, coaching can focus on recognizing good performance and identifying opportunities for improvement
How to Conduct a Call Calibration Session
Following are some best practices for holding call calibration sessions:
- Schedule an hour for each session.
- Select calls that are representative of the majority of the calls you receive, or calls that illustrate a particular issue. For some calibration sessions, listen to calls chosen at random.
- Determine an acceptable variance in scoring. If you’re scoring newly created forms, if you’re scoring with newly hired supervisors or QAs, or if you’ve recently changed your forms, you can accept a wider variance—perhaps 10 points. Once you’ve held a few calibration sessions and fine-tuned understanding, 5 points might be the acceptable range. Don’t expect to achieve perfect calibration immediately! It may take several sessions before you’ll achieve a small variance. Focus first on calibrating Standards, as those are the behaviors most critical for job success.
- If calls are recorded, send them to the session participants, ask them to score the calls independently before the calibration session, and have them record their scores, either in call monitoring software or by sending their monitoring forms to you. This will allow you to see which participants may need individual help.
- Review the Call Quality Guide at the beginning of each session to be sure that everyone understands what makes up a successful call.
- If you sent recorded calls, post the scores on a flip chart or whiteboard so that you can see the variance. If you didn't send recorded calls, play the selected calls (or listen to live calls) and ask participants to score them one at a time. Discuss after each call.
- Ask one person to summarize the call.
- Review each Standard and ask participants whether the Standard was met, or not met. Discuss until each member understands why the Standard was met, or not met.
- Review each Objective, discuss the scoring variances, and ask participants to explain why they scored as they did.
- Do the same with the next call until you’re close to the end of the hour.
- At the end of the session, summarize lessons learned.
- Create notes from the session and distribute to participants for future reference.
Expect a lively discussion during your calibration sessions. Be sure to create a non-confrontational environment where everyone feels safe sharing his or her opinion. The goal of the session is not to prove who's right or wrong, but to be sure everyone understands the criteria for evaluating employee performance—so that good performance can be praised and performance gaps can be addressed.