In last week’s post I wrote about the reasons examples of B2B Customer Experience Management successes and failures are not as widely available as B2C ones. I also began describing a specific example of the pitfalls encountered on a journey of CX discovery. This post is about negotiating those pitfalls and translating findings into actions.
The suggestion was made to take another stab at the problem, but from a different perspective. Until then, all inquiries had focused on the product. Perhaps focusing on why visitors downloaded the free version, or even why they came to the website in the first place, would provide insights that would help increase the conversion rate. It was clear that the free version was not sufficiently meeting customer expectations, as only 18% were still using it for “minor” projects six months after downloading. What was not clear was what kind of projects they had hoped to tackle with the product. This is not an easy question to tabulate answers for. However, a clear and statistically representative answer would help us understand whether our website attracts the “wrong” customers, i.e. whether there is a mismatch between the problems they have and the solutions we offer.
Interestingly enough, the “survey” (with open-ended, free-format questions that focused on the customers’ experience instead of the company’s problems) generated a response rate of over 13%. The automated analysis of this feedback exonerated the digital marketing team – they had been attracting the “right” customers – but the customers’ actual experience with the product did not live up to the expectations created by the marcom. The detailed analysis of the collected customer feedback produced a list of CX attributes in order of their importance to the customers, as well as a measurement of the delta between their expectations and experience (example).

Two issues stood out, causing customer disappointment of 25% and 38% respectively – customer support and usability. The first finding incensed the customer support team, who stormed in armed with an NPS of 87. The customer support team was serving only paying customers, while the freemium ones were supported by other customers (the community) and an automated knowledge system. That prompted us to segment the customer feedback data into paying and freemium customer contributions. Indeed, paying customers did not experience any disappointment with customer service. Usability was a problem for them as well, although with much lower impact.
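For readers who want to see the mechanics of that gap analysis, here is a minimal sketch in Python. It assumes the open-ended comments have already been coded to CX attributes and scored; the file name, the column names (attribute, segment, expectation, experience), and the scoring scale are all hypothetical illustrations, not the actual tooling used in this project.

```python
import pandas as pd

# Hypothetical input: one row per coded survey comment, with the CX
# attribute it was coded to, the customer segment ("paying"/"freemium"),
# and expectation/experience scores on a common scale (names assumed).
responses = pd.read_csv("survey_responses.csv")

# Per segment and attribute: response volume (a rough proxy for importance)
# and the mean gap between what customers expected and what they got.
gaps = (
    responses
    .groupby(["segment", "attribute"], as_index=False)
    .agg(mentions=("attribute", "size"),
         expectation=("expectation", "mean"),
         experience=("experience", "mean"))
)
gaps["disappointment"] = gaps["expectation"] - gaps["experience"]

# Rank attributes by how much they disappoint each segment, e.g. to see
# whether customer support hurts freemium users but not paying ones.
print(gaps.sort_values(["segment", "disappointment"],
                       ascending=[True, False]))
```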
A deeper dive into their comments produced the following insight: the product has sufficient functionality to meet target customer expectations, but only the savviest and/or most technical users are capable of figuring out how to draw on this functionality to achieve the goals they expected the product to deliver.
Plan of action:
- Extend full customer support (2nd tier/priority) to freemium customers, and offer 1st-tier support as a paid option that does not require conversion to a paid license.
- Launch a process-driven, i.e. customer-centered, UX study to learn how to simplify use of the product.
- Re-design the product front end based on the UX study findings.
Afterword
Implementation of the first step of the plan resulted in 4% additional revenue from an increased conversion rate after the very first quarter. Learning from Zappos’ experience, management shifted 20% of the marketing budget to customer support, which is now considered a revenue-generating department. The subsequent steps, as they were gradually implemented into production, reduced the customer support load to below its original cost per customer.
Most of us are very focused on what we think we do – product people are product-centric, customer service people are support-centric, etc. – but we are all in the business of delivering the best customer experience, and we should excel in our part of it without losing focus on the big picture.