I love the book Moneyball. I’m not particularly a baseball fan, but I love the lessons on how a counterintuitive approach can create disproportionate outcomes. I recently listened to the Freakonomics podcast “Did Michael Lewis Just Get Lucky with Moneyball?” and it reminded me of the lessons that Moneyball offers for us in CX.
If you’re not familiar with the book, here’s an excerpt from the podcast:
“Moneyball essentially covers a year in the life of the Oakland A’s, a baseball team that had previously spent a lot of money on players but under its current ownership was only willing to spend less than half of what big-money teams like the Yankees and Red Sox and Dodgers were spending. The orthodoxy in sports is that more money buys better players, and that better players will win more games. The Oakland A’s, out of necessity, challenged that orthodoxy. How? They hunted down good data and analyzed it ruthlessly to answer a simple question: when a baseball team wins a game, exactly why did they win? Which characteristics or behaviors were truly valuable, and which ones just appealed to the tastes of the orthodoxy? The A’s then identified which characteristics of winning were overvalued, and which ones were undervalued — and then they set out to acquire the type of player who possessed those undervalued qualities. Which, fortuitously, meant these players were cheap.”
My main takeaway from the book is that you benefit by digging into the outcomes that actually matter (runs scored, customers who stay loyal or churn) rather than judging by what’s in front of you.
Baseball scouts were traditionally distracted by “vividness” – easily visible qualities. A tall, strong player with a fast swing looks impressive, so scouts prioritized those players. The same scouts hated “fat catchers” who didn’t fit their image of what a baseball player should look like. Contrast the physical look of a player with the concept of plate discipline (rarely swinging at the first pitch, waiting for an ideal pitch to hit). The look is vivid; plate discipline isn’t. But the latter is far more predictive of runs. By focusing on the behaviors that mattered, Oakland was able to be far more efficient and effective – the same outcomes we look for in our CX programs.
So how do we apply that to our work?
We have the same issue with vividness as baseball scouts. One quote I’m sharing a lot recently is from Daniel Kahneman: “What you see is all there is.” All baseball scouts saw was the look of the player and the swing. Discovering plate discipline took more advanced analytics because what mattered was what you didn’t see – players with plate discipline swung at fewer pitches. That’s not glamorous, so scouts never picked up on it.
For us, NPS scores are the equivalent of a good-looking player. If what you see is all there is, then survey results are all we see, so that’s how we evaluate the health of our customers. But this narrow focus on survey scores ignores the 90% of customers who don’t fill out our surveys. Our closed-loop processes don’t include these customers (because they’re based on survey results), and neither does our reporting.
But there is a better way. We can Moneyball our programs by incorporating the Customer Ecosystem Data into our analysis.
The Customer Ecosystem Data is the descriptive, behavioral, operational, and financial data involved in the customer experience. When a customer decreases their order velocity, that may be because of a customer experience issue. We should investigate it. When customers require four calls to resolve an issue or file a claim, odds are that they’re unhappy. We should put them into the closed-loop process.
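To make that concrete, here’s a minimal sketch of routing customers into the closed loop straight from operational signals, with no survey required. The column names and thresholds (order_velocity_change_pct, the four-call trigger, and so on) are hypothetical placeholders, not a prescription:

```python
import pandas as pd

# Hypothetical customer ecosystem data: one row per customer, with
# behavioral and operational signals joined from different systems.
customers = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "order_velocity_change_pct": [-35.0, 4.0, -10.0],  # vs. trailing six months
    "calls_to_resolve_last_issue": [4, 1, 2],
    "open_tickets": [3, 0, 1],
})

# Simple rule-based triggers for closed-loop follow-up (no survey needed).
needs_follow_up = (
    (customers["order_velocity_change_pct"] <= -25)      # sharp drop in orders
    | (customers["calls_to_resolve_last_issue"] >= 4)    # high effort to resolve
    | (customers["open_tickets"] >= 3)                   # issues piling up
)

closed_loop_queue = customers.loc[needs_follow_up, "customer_id"]
print(closed_loop_queue.tolist())  # e.g., [101]
```

The point is simply that the trigger comes from behavior the customer actually exhibited, not from whether they happened to answer a survey.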
As I wrote in my last blog post, your first step is to determine what behavior you’re trying to understand – for example, what causes customers to decrease their orders from you. If that’s your issue, look at some common behavioral and operational data, such as:
- On-time deliveries for this customer
- Open tickets
- Number of calls to the contact center
- Escalations to leadership
- Product quality issues
- Frequent changes to delivery dates
- Delayed implementations
Pull the historical data and you can see which of these signals best predicts decreased order velocity. Use the same approach for retention: analyze which data has the strongest predictive power. Only after you’ve completed that should you analyze your surveys to see whether they add more context. Do customers who experienced the issue report that their issues weren’t resolved, that the agent wasn’t helpful, or something else? This is where surveys shine: providing the “why” that the operational data doesn’t show you.
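As one way to run that historical analysis, the sketch below fits a simple logistic regression on (hypothetical) historical data and ranks which signals carry the most predictive weight for decreased orders. The file name, column names, and choice of model are all assumptions; any method that surfaces feature importance would serve:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical historical dataset: one row per customer per quarter.
# "decreased_orders" is 1 if order velocity dropped the following quarter.
history = pd.read_csv("customer_history.csv")  # placeholder file name

features = [
    "on_time_delivery_rate",
    "open_tickets",
    "contact_center_calls",
    "leadership_escalations",
    "quality_issues",
    "delivery_date_changes",
    "implementation_delay_days",
]

X = history[features]
y = history["decreased_orders"]

# Standardize the features so the coefficients are roughly comparable.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X, y)

# Rank signals by the magnitude of their coefficient.
coefs = pd.Series(model.named_steps["logisticregression"].coef_[0], index=features)
print(coefs.sort_values(key=abs, ascending=False))
```

The specific model matters far less than the ranking it gives you: which operational signals actually move revenue and retention, so that surveys can then be layered on top for the “why.”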
This type of analysis starts from a different place than most CX programs: step away from surveys and focus on the drivers of business success, revenue and retention. That’s what your business partners care about.
Take a page from Billy Beane, the general manager who led the A’s, and don’t look at what everyone else is looking at. Instead, determine what is most critical to success and focus your efforts there.