Customer experience (CX) improvement efforts rely heavily on customer feedback to help companies determine where to make improvements and how to make them happen. This article discusses how companies can use two types of customer surveys: relationship surveys and transactional
surveys.
Relationship Surveys
Relationship surveys allow customers to indicate their satisfaction with their overall relationship with the company/brand. Relationship
surveys are typically administered periodically (e.g., every other quarter, annually) and ask customers to indicate their
loyalty toward and satisfaction with the company across several business areas (e.g., product, service) over a non-trivial time period (6-12 months).
Not all CX touch points are equally important in driving customer loyalty. Relationship-level surveys focus on understanding which CX touch points drive customer loyalty, helping executives prioritize the touch points that contribute most to it. Relationship survey questions fall into four categories:
Customer Loyalty – survey questions reflect customers’ overall loyalty toward the company
Customer satisfaction with the customer experience – survey questions reflect satisfaction with the company across broad business areas (e.g., product, service)
Relative Performance – survey questions reflect your performance ranking against the competition
Open-ended – survey questions uncover reasons behind the ratings
The combination of the CX satisfaction questions with the loyalty questions will tell you which factors need further evaluation via transactional surveys.
CX areas in which customers are dissatisfied and that are highly predictive of customer loyalty are referred to as key drivers. Improving
customer satisfaction in these key drivers will lead to increases in customer loyalty. Once a business knows where to make improvements,
the next step is to understand how to improve those particular experiences. This deeper understanding is uncovered via
transactional surveys.
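To make the key-driver idea concrete, here is a minimal sketch of a key driver analysis in Python with pandas. The data, column names, and thresholds are hypothetical and only illustrate the logic: correlate satisfaction in each CX area with loyalty, then flag areas that are both strongly related to loyalty and low in satisfaction.

```python
# Minimal key driver analysis sketch (hypothetical data, columns, thresholds).
import pandas as pd

# Hypothetical relationship survey responses (0-10 ratings).
survey = pd.DataFrame({
    "product_quality": [9, 8, 7, 9, 6, 8, 7, 9, 5, 8],
    "service_quality": [6, 5, 7, 4, 5, 6, 5, 7, 4, 6],
    "tech_support":    [5, 4, 6, 3, 4, 5, 4, 6, 3, 5],
    "communication":   [8, 7, 8, 7, 6, 8, 7, 8, 6, 7],
    "loyalty":         [7, 5, 7, 4, 4, 6, 5, 8, 3, 6],  # e.g., overall loyalty rating
})

cx_areas = [c for c in survey.columns if c != "loyalty"]

# "Importance" of each CX area = correlation of its satisfaction ratings with loyalty.
importance = survey[cx_areas].corrwith(survey["loyalty"])

# Current performance = mean satisfaction rating for each CX area.
performance = survey[cx_areas].mean()

summary = pd.DataFrame({"importance": importance, "mean_satisfaction": performance})
print(summary.sort_values("importance", ascending=False))

# Candidate key drivers to improve: strongly related to loyalty, low satisfaction.
priorities = summary[(summary["importance"] >= 0.5) & (summary["mean_satisfaction"] < 6.0)]
print("\nFirst priorities for transactional surveys:\n", priorities)
```

In practice you would use a much larger sample and often a regression-based driver model rather than simple correlations, but the prioritization logic stays the same: the low-satisfaction, high-impact areas are the ones to probe with transactional surveys.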
Transactional Surveys
Transactional surveys let customers indicate their satisfaction with a specific event/transaction/interaction with the company, typically
revolving around a specific customer touch point (e.g., sales process, product quality, support quality, communication). Unlike relationship surveys,
transactional surveys are administered immediately or soon after the customer has had a specific interaction with the company (e.g., support, sales, product).
The relationship survey results will guide which transactional surveys you need to conduct. CX areas that scored low on customer satisfaction and are
important in driving loyalty should be the first priority for your transactional survey efforts.
Unlike the relationship survey, where the focus was on understanding the comprehensive customer experience over time, a transactional survey focuses on a
specific interaction/touch point. Its goal is a deeper understanding of what aspects of the experience left customers
dissatisfied. If the relationship survey identified “technical support” as a key driver, a transactional survey on technical support would help identify
the specific ways you can improve its quality and, in turn, customer satisfaction with it.
While each transactional survey will be inherently different because of the different questions needed to address specific interactions, transactional survey
questions generally fall into four categories (an illustrative sketch follows the list):
Overall Satisfaction with the Event/Transaction/Interaction – One survey question reflects customers’ overall evaluation of their experience
Customer satisfaction with the customer experience – survey questions reflect satisfaction with specific touch points within the experience
Relative Performance – survey questions reflect your performance ranking against the competition
Open-ended – survey questions uncover reasons behind the ratings
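As an illustration only, the sketch below shows how a transactional survey for the “technical support” touch point might be organized around these four categories; the question wording and structure are hypothetical, not taken from the article.

```python
# Hypothetical transactional survey for the "technical support" touch point,
# organized by the four question categories described above.
TECH_SUPPORT_SURVEY = {
    "overall_satisfaction": [
        "Overall, how satisfied were you with this support interaction? (0-10)",
    ],
    "cx_satisfaction": [
        "How satisfied were you with the time to resolution? (0-10)",
        "How satisfied were you with the rep's technical knowledge? (0-10)",
        "How satisfied were you with communication during the case? (0-10)",
    ],
    "relative_performance": [
        "Compared with other vendors' support, how would you rate ours? (much worse to much better)",
    ],
    "open_ended": [
        "What is the primary reason for the rating you gave?",
    ],
}
```

Keeping the categories explicit like this makes it easy to report, for each transaction type, one overall score, a set of touch-point diagnostics, a competitive benchmark, and the verbatims that explain the ratings.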
Summary
Figure 1. Using Relationship and Transactional Surveys in your CX Improvement Efforts
When you think about customer relationship and transactional surveys, it’s best to think of them as complementary efforts in your quest to improve how you
do business. Your relationship survey helps you understand where you need to make improvements (e.g., product, service, marketing) while transactional
surveys help you identify what needs to be done to improve those experiences.
In other words, relationship surveys provide information to help with CX strategic decisions (e.g., what areas of the business you need to improve); transactional surveys provide information to help with tactical decisions (e.g., how you are going to make CX improvements happen). Figure 1 summarizes how relationship surveys and
transactional surveys fit into CX strategic and tactical decision-making.
Comments
Great summaries, Bob. I hope companies are thinking about the ways that their VoC portfolio can create synergy in an end-to-end view of customer experience. For example, customer journey maps are a great way to discover the things that should be monitored in the relationship survey. Rather than track everything, or organize the relationship survey by your departments, find out through journey mapping — or just as well possibly, via text mining of customer-initiated feedback — what customers really want to give feedback on. And then set the cadence according to the degree of change that occurs in market perceptions in the industry, in combination with how quickly your organization can make noticeable differences between relationship survey deployments.
For transaction surveys, I think sampling is used less often than it should be — seems that nearly everyone is asking the whole population to reply to surveys. I suppose that’s due to an effort to tie VoC to individual employees or departments. It would be better if it was made obvious to customers where they could go anytime they want to give feedback (and see progress reports there of what the company is working on), but not to be invited explicitly for every transaction they do with a company. I guess we could call that “invitation fatigue”. And it would be best if companies tied performance assessments/pay to internal metrics of employee/department behaviors/actions rather than burdening customers with constantly providing judgments. We did this at Applied Materials where I led VoC and customer experience improvement for many years.
I think it would also move everyone ahead if we re-think phrasing of indexes and satisfaction questions. In place of asking how well someone or something performed, what would be the impact on the respondent and on insights and actionability if we ask: “how well did this help toward your objectives?” . . . And knowing what the customer’s objectives are (i.e. job-to-be-done; capability sought) provides context that can be used for segmenting responses and making a better match of actioning with outcomes that customers will reward (i.e. CX ROI).
Lynn
Bob, I agree that it’s important to have both transactional and relational surveys in the arsenal of an organization, but only if, as you say, the company is set up to do something about the results.
I agree with Lynn and am a zealot of mapping results across the journey, versus cherry-picking by survey question and siloing what gets examined. That approach causes silo-based, splintered projects that don’t improve an experience completely.
I’d like to assert that, in addition to using what I think of as ‘aided’ or ‘push’ feedback, companies also blend in two other kinds of listening, also mapped to the journey stages. The second type is ‘unaided’ – meaning all the feedback your customer is volunteering to you. Find the top three or so with the most tonnage and create a common categorization of the issues or ideas so they can roll up in volumes hard to ignore. Then also engage different levels of your organization in ‘experiential listening.’ This means if you require customers to create an account, your company people also have to create an account. Take all the customer-required processes, dole them out, and have people walk in the customers’ shoes.
When you can blend all three types of listening, mapped to the journey stages, people become more personally involved. The customer comes off the spreadsheet or dashboard and shows up as human, and volumes of trends that converge across multiple listening types create focus for action.
My two cents. Thanks for the great article!
J
I like how simply but usefully the 4 points are summarised. My company needs a lot of CX optimisation, especially when it comes to clients coming from our website. With the amount of visits, the website should be generating most of the company’s profit, but at this stage its conversion is not quite good. I know this doesn’t have much to do with conducting a survey, but this could be one way to understand what’s wrong. This article could really prove a pivotal point for a new customer experience program targeting improvements to both the CX and the User Experience.
Great summary. How do I get a list of highly respected Relationship survey vendors?