Whether you’re leading customer experience at a B2B or B2C company, I think you’d agree that 2020 certainly had an impact on customer feedback. This year’s unpredictable and unrelenting changes challenged us in every area of our customer experience program, particularly in customer feedback quantity and quality.
At the B2B company where I work, we use surveys to capture the majority of our customer feedback. This approach was effective until COVID-19 hit. During the first months of the crisis, we noticed our survey responses were down by as much as 50 percent and sentiment skewed negative. When we took a closer look, the skewed results were not service- or product-related. They were due to the impact the pandemic was having on people’s lives, and this was reflected in their responses as well as their non-responses. This is when I went to my colleague and resident experience management expert, Rick Blair, to figure out what we could do to create a better mix of direct, indirect, and inferred customer feedback sources. Rick suggested that we revisit how we were (or weren’t) using feedback sources in order to get the most out of our data and potentially discover new sources that would help fill the gap.
There are three types of feedback: direct, indirect, and inferred. These common sources of data come together to create holistic, systemic voice of the customer (VOC) coverage for businesses across a variety of markets. When our survey methods were impacted by the pandemic, we had to take a step back and determine which other sources would provide the right insights, based on where our customers were going to get information and value.
The diagram below illustrates how we define the three feedback sources and the benefits of each.
Common Missteps when Collecting and Analyzing Customer Feedback
As I’ve gone through the discovery process to find the right balance, I’ve found some common missteps that can be made as voice of the customer programs are developed:
- Assuming surveys are all you need to capture the voice of the customer. While traditional surveys are structured and stable and provide solid comparisons and benchmarks, they require a significant amount of time from your customers. And, as most VOC experts know, only a certain subset of customers respond to surveys.
- Not applying the right mix of the different sources of customer feedback. As I mentioned before, we were using direct customer feedback in the form of surveys to collect a majority of our customer data. For some time we had considered looking at indirect and inferred types of feedback and, after receiving the skewed results after COVID-19 took hold, we quickly realized it was time to stop talking about it and jump into action. We needed to bring other sources into the mix to get a more balanced set of results that reflect what our customers need and want.
- Not understanding how the sources can be used collectively to find value in data. For example, as a B2B company, indirect feedback doesn’t generate the kind of data it would for a B2C company. Our website is mainly designed for marketing purposes, to give potential customers information on the company, our solutions, recent case studies, news and other relevant content. Given this model, we thought surveys would be the best option to collect customer feedback.
- Thinking that one size fits all. We started to explore the use of behavioral analytics and found that one size does not fit all. Inferred feedback helps identify journey breakpoints by detecting and monitoring customer behavior. However, the journey that a customer has with one company is different than with another company. Therefore, you cannot copy another company’s use of inferred data; it has to be customized to your customer journey.
How to Find the Right Mix of Customer Feedback
I have a rule that always seems to be in play: when times are challenging, it’s the tactics that change, not the goal. This certainly held true with the impact COVID-19 had on our customer feedback program. So when we learned we had to go in a new direction, we referred back to our CX goal, which is delivering the best customer experience possible. With that in mind, I have taken the following approach to find the right balance of customer feedback sources:
Step 1: Understand where your customers are going to get the information they need and then understand why they keep coming back.
Marrying indirect feedback to direct feedback lets you define where to place direct options. Alternatively, you can use indirect triggers to prompt for direct feedback. For example, our customers rely heavily on the Verint Community. It’s here that they come to get information on new product updates, find new ways to use the solutions they currently have, collaborate with the Verint team, and share ideas with other customers. We’ve found that activity in the Verint Community reveals an abundance of insights into how our customers are using and finding information. Our customers have uncovered creative ways to solve CX challenges, submitted new feature requests, helped us identify unique use cases and tips on managing new remote work environments, and more. Using indirect (customer-initiated) feedback and then prompting for specific direct feedback is providing powerful insights.
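To make the idea of indirect triggers concrete, here is a minimal sketch of using an indirect signal (repeated visits to a single community topic) to decide when to show a direct feedback prompt. The event shape, threshold, and topic names are all invented for illustration; a real implementation would draw on your own analytics events.

```python
# Hypothetical sketch: use indirect (customer-initiated) activity as a
# trigger for a direct feedback prompt. Names and thresholds are invented.

def should_prompt_for_feedback(events, min_visits=3, topic="product-updates"):
    """Prompt for direct feedback once a customer has repeatedly
    engaged with the same community topic (an indirect signal)."""
    topic_visits = [e for e in events if e["topic"] == topic]
    return len(topic_visits) >= min_visits

# Simulated session: three visits to the "product-updates" topic.
session_events = [
    {"topic": "product-updates"},
    {"topic": "product-updates"},
    {"topic": "remote-work-tips"},
    {"topic": "product-updates"},
]

if should_prompt_for_feedback(session_events):
    print("Show short survey: did these posts answer your question?")
```

The design choice here is that the direct ask is earned by engagement, so the prompt lands in context rather than interrupting a first-time visitor.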
Step 2: Understand that the three sources of customer feedback are ingredients that work together.
Think about it: when you have a fabulous dish, you don’t think of the end result as one thing. It’s a combination of different ingredients that come together to make up the recipe. It’s the same idea with customer feedback sources. We’ve found that inferred data is invaluable and can easily be used with other feedback sources for a more robust understanding. For example, a bank can see how customers open an account online and can tell if they were successful or not. Did they end up having to go to a branch to finish a task? That’s a good piece of data that can be used to determine how to improve the experience. As a technology company, our customers start with the online Verint Community and, if the answer isn’t there, they go to Customer Support. Survey data might point to that successful or unsuccessful customer experience with Customer Support. But it’s also valuable to find out what was missing in the online community that caused our customer to open a ticket with Customer Support. We plan to drill down using both sets of data (direct and inferred) to get a better picture of the customer journey and identify where customers run into issues. Refining our ‘recipe’ will give us an understanding not just of customer feedback at the end of an interaction, but also insight into what led the customer to initiate that interaction.
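As a rough illustration of combining two of these ingredients, the sketch below joins hypothetical direct data (post-support survey scores) with hypothetical inferred data (community pages visited before a ticket was opened) to surface possible content gaps. Every record, field name, and threshold here is invented; the point is the join, not the data.

```python
# Hypothetical sketch: combine direct feedback (post-support survey scores)
# with inferred data (community pages visited before the ticket was opened).
# All records, field names, and thresholds are invented for illustration.

surveys = [   # direct: satisfaction score after a support interaction
    {"customer": "A", "ticket": 101, "csat": 2},
    {"customer": "B", "ticket": 102, "csat": 5},
]

journeys = [  # inferred: community activity that preceded each ticket
    {"ticket": 101, "pages_before_ticket": ["kb/install", "forum/errors"]},
    {"ticket": 102, "pages_before_ticket": []},
]

by_ticket = {j["ticket"]: j for j in journeys}
flagged = []
for s in surveys:
    pages = by_ticket.get(s["ticket"], {}).get("pages_before_ticket", [])
    # Low satisfaction *and* a failed self-service attempt: the pages the
    # customer tried first are candidates for community content gaps.
    if pages and s["csat"] <= 3:
        flagged.append((s["ticket"], pages))

for ticket, pages in flagged:
    print(f"Ticket {ticket}: review community content on {pages}")
```

Neither source alone tells this story: the survey says the support interaction went poorly, but only the inferred journey shows which self-service content failed first.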
Step 3: Test and refine your approach.
Meaningful indirect and inferred data is specific to your journey, so you should prioritize the sub-journey that is a top driver of overall customer satisfaction. And the first attempt at developing a good source of this data may not be the best. It’s important to continuously assess and refine the approach. As I mentioned, we started in our online community and implemented a customer-initiated source of data with a simple form. However, the data we received was sparse. We continue to refine the way we’re asking the questions, placing the form on the site, and interpreting the data. With each refinement we move closer to the goalposts, although there is always an opportunity to improve. Once this is done, making the connection between direct and indirect feedback (specifically, tying analytics data to feedback) is what powers a deeper understanding of the customer experience.
I’ve been on a mission to find the perfect mix of customer feedback since March, and now understand that it’s a continuous process. One that requires constant attention and refinement. If the pandemic has taught us anything—it’s certainly taught us about resilience. This year, we have honed our ability to address any challenge head-on and learned from our successes as well as failures in order to ultimately improve customer experience.
Nancy, I’ve not considered the sources of VoC in categories such as these. This is a very interesting way of categorizing them with regard to what they bring.
What sorts of methods would you put into each category? For example, NPS surveys are likely in the “Direct” category, and market analysis/SoMe ‘lurking’ would likely be “Indirect”. What other examples would you suggest?
Surveys are definitely in the “direct” category – as are speech and text analytics. For indirect feedback, think about customer-initiated feedback on your website, for example. It can be as simple as a form that customers fill out when they want to provide feedback (which provides trend and emotion data). Inferred data is behavioral, so this can vary depending upon your products and services. As a B2B technology company, we can look at behavior through the use of our products or even how customers look for support and resources on Verint Connect, our online community. Hope that helps!