Customer-Centric Thinking: A Collaboration of Man and Machine


Five percent of the people think; ten percent of the people think they think; and the other eighty-five percent would rather die than think.
—Thomas A. Edison

Of the five habits of customer-centric leaders, I suppose Think is my favorite. Maybe it’s because of my analytic bent from an early age, which led to a degree in Mathematics and an MBA that emphasized statistical analysis and operations research. I’ve always been intrigued by how to solve complex problems with methodologies and tools.

Or perhaps I was influenced by IBM, where I was employed for over 15 years. IBM founder Thomas J. Watson, Sr. actually created THINK while a sales manager at NCR in 1911. Frustrated at the lack of good ideas at a meeting, he proclaimed: “The trouble with every one of us is that we don’t think enough. Knowledge is the result of thought, and thought is the keynote of success in this business or any business.” After joining IBM he used THINK to foster a culture of using technology to help business leaders make better decisions.

And, of course, CustomerThink Corp. is the name of my firm, and www.customerthink.com is where you’ll find the online community that has been a key part of my life for more than a decade. The name actually came from community input, when we held a contest to find the best name to convey a focus on customers.

Whatever the reason, I believe that making smart, fact-based decisions is absolutely essential to customer-centric leadership. I’m sure many of you reading this will say to yourselves, “We’re already doing that.”

Sorry, but you’re probably not. As you’ll learn, many decisions are made out of habit, not conscious thought. And others are influenced by biases we all have.

Read on, and you’ll also understand why I don’t advocate analytics or technology as the be-all and end-all of effective decision making. Human judgment will never go out of style. Not every insight can be put into a spreadsheet, analytic model or computer system. In fact, the explosion of Big Data and advanced tools will stretch human decision-making skills, not make decisions easier.

Oh, the Thinks That We Think

We humans often think we think better than we actually do. Why is that?

First of all, how can we make rational decisions about customers when customers themselves don’t make decisions rationally? In his provocative 2008 book Predictably Irrational, behavioral economist Dan Ariely argues that people don’t always make decisions by rational choice, weighing the benefits of a potential action against the costs. David Berreby put it well in a NY Times review:

Yes, you have a rational self, but it’s not your only one, nor is it often in charge. A more accurate picture is that there are a bunch of different versions of you, who come to the fore under different conditions. We aren’t cool calculators of self-interest who sometimes go crazy; we’re crazies who are, under special circumstances, sometimes rational.

Just to pick one example from a book full of myth busters, consider how “decoys” influence our decision making. We are constantly looking at things in relation to others, and tend to avoid comparing things that cannot be easily compared. In his book Ariely illustrates this with a story about honeymooners choosing between three travel packages: Rome with a free breakfast (A); Rome without the free breakfast (B); or Paris with a free breakfast (C). Instead of rationally weighing Rome against Paris on their merits, people tend to pick Rome with the free breakfast because it’s clearly superior to a similar alternative. Option B is known as a decoy because it serves only to make option A more attractive.

The implication for business leaders is that it’s hard to convince consumers to do something dramatically different, even when it is a much better alternative. The safer route, in the short term at least, is to present choices similar to those already available in the market. That’s probably one reason why we see “New Improved!” labels on products with minor updates, and few breakthrough products. It’s our fault!

But there’s another factor at work: habits. A 2007 Duke study found that we are on automatic pilot for nearly half of the choices we make each day. The study concludes: “features of a person’s context (e.g., people, places, preceding actions) can be powerful, automatic triggers of habit performance and that habits are executed with minimal recourse to conscious monitoring.”

Also consider the ground-breaking work of Nobel Memorial Prize winner Daniel Kahneman, who summarized decades of research on human thinking and behavior in a 2011 book, Thinking, Fast and Slow. He found that we humans use two systems for thinking:

  • System 1: Fast, automatic, frequent, emotional, stereotypic, subconscious
  • System 2: Slow, effortful, infrequent, logical, calculating, conscious

Even in conscious “System 2” thinking, people struggle to think statistically. For example, we are prone to overestimate benefits and underestimate costs. We are easily swayed by a small sample of readily available data that supports something we want to do, and ignore other factors that should be considered.

This “optimistic bias” encourages us to take on risky projects, and may explain why a good rule of thumb in IT project planning is to take the most conservative (highest) estimate of time and costs, and then double it! Projects almost never get done on time because we don’t allow for unknown unknowns. If you don’t believe me, take a look at your last home remodeling project. I’ll bet it cost at least twice as much as you budgeted, because of problems that surfaced after you started.
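
Why does doubling work so well as a rule of thumb? Here’s a minimal simulation sketch (the task counts and distributions are illustrative assumptions, not from any study) of a ten-task project where each task is estimated at 10 days but actual durations are right-skewed, as overruns usually are:

```python
import numpy as np

rng = np.random.default_rng(0)
n_tasks, n_sims = 10, 100_000
estimate_days = 10   # each task's "most likely" estimate
sigma = 0.8          # right skew: overruns run longer than savings run short

# Actual durations are lognormal with the estimate as the median: half the
# time a task beats its estimate, but the overruns have a long tail.
durations = rng.lognormal(mean=np.log(estimate_days), sigma=sigma,
                          size=(n_sims, n_tasks))
totals = durations.sum(axis=1)

print(f"planned total:   {n_tasks * estimate_days} days")
print(f"mean actual:     {totals.mean():.0f} days")              # ~138 days
print(f"90th percentile: {np.percentile(totals, 90):.0f} days")  # ~190 days
```

Even though every task beats its estimate half the time, the skewed tail pushes the 90th-percentile project to roughly double the plan, which is about what the old rule of thumb prescribes.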

Torturing Data to Tell You What You Want to Hear

In case you were looking for a pitch about why computers and analytics hold the answer to human shortcomings, I’m sorry to disappoint you. Statistics can be just as misleading. As economist Ronald Coase famously said, “If you torture the data long enough, it will confess.”

A recent example of statistics abuse comes from my local newspaper, where a journalist wrote about the U.S. lagging behind other countries in average Internet speeds. With the U.S. “limping behind Latvia and the Czech Republic” in ninth place, he concludes this is “troubling news” because studies link faster Internet connections to economic growth. Doubling Internet speeds would increase GDP by 0.3%, which in the case of the U.S. would mean an additional $126 billion.

Wow! Why aren’t we doubling speeds right now? Where’s the hot line to the White House?

Was the columnist taking creative license with the report findings? Sadly, no. I reviewed the full report, sponsored by Ericsson, and found these conclusions in the section “The Socioeconomic Impact of Data Speeds.”

  • Doubling the broadband speed for an economy increases GDP by 0.3 percent
  • 80 new jobs are created for every 1,000 new broadband connections
  • For every 10 percent increase in broadband penetration the GDP growth is around 1 percent
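
The headline numbers also reward a quick sanity check. Assuming U.S. GDP of roughly $16 trillion at the time (my assumption, not a figure from the report), 0.3 percent works out to something closer to $48 billion, and a $126 billion gain would imply an economy of about $42 trillion, a region much larger than the U.S. alone:

```python
# Back-of-the-envelope check, assuming U.S. GDP of roughly $16 trillion
us_gdp = 16e12
print(f"0.3% of U.S. GDP: ${0.003 * us_gdp / 1e9:.0f} billion")  # ~$48 billion

# What size economy would make $126 billion equal to 0.3 percent?
print(f"implied GDP: ${126e9 / 0.003 / 1e12:.0f} trillion")      # ~$42 trillion
```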

Should you take the report at face value? You might want to consider the fact that Ericsson is a provider of technology to telecom operators. Needless to say, it has a vested interest in looking for conclusions in the data to “sell” the impact of faster Internet speeds.

The report says it used regression analysis, a statistical technique that attempts to relate one or more independent variables (here, broadband speed) to a dependent variable (economic impact). But correlation doesn’t imply causation. While it may seem reasonable that advanced economies use technology more aggressively, it doesn’t follow that technology literally causes economic growth. In my view, broadband speed is at best just one of many growth enablers: roads, bridges, power, and schools also come to mind.

Actually, it’s possible that cause and effect works in the opposite direction. As economies grow, businesses decide to invest in faster Internet speeds. The correlation would look exactly the same, but the conclusion wouldn’t support Ericsson’s marketing message.
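
Here’s a minimal sketch of why the regression alone can’t settle the question (the numbers are made up for illustration). In one simulated world broadband speed drives growth; in the other, growth drives broadband investment. Both produce essentially the same correlation and a similar-looking regression:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# World A: broadband speed drives GDP growth
speed_a = rng.normal(50, 10, n)                  # Mbps
gdp_a = 0.3 * speed_a + rng.normal(0, 2, n)

# World B: GDP growth drives broadband investment (reverse causation)
gdp_b = rng.normal(15, 3, n)
speed_b = 3.0 * gdp_b + rng.normal(0, 5, n)

for label, speed, gdp in [("A: speed -> GDP", speed_a, gdp_a),
                          ("B: GDP -> speed", speed_b, gdp_b)]:
    r = np.corrcoef(speed, gdp)[0, 1]
    slope = np.polyfit(speed, gdp, 1)[0]         # regress GDP on speed
    print(f"{label}: correlation {r:.2f}, slope {slope:.2f}")
```

The regression happily reports a strong positive relationship in both worlds; nothing in the output tells you which way the causal arrow points.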

Confusing correlation with causation is just one of the more common mistakes in using analytic techniques. Yet correlation is frequently used to promote the idea that some new technology or trend will lead to business success. Only rarely is the research done in a way that shows true cause and effect.

For example, excellence in Customer Experience (CX) is correlated with business performance. But I haven’t seen any studies that show that an improvement in CX is followed by improved business performance (after a suitable time delay), or that CX is the only factor involved in driving success. In fact, other loyalty studies show that product quality and price/deals continue to have a big influence on customer buying and retention. In my view, CX is an important factor in customer loyalty, but it’s being oversold as the main driver. For more on why this is happening, read Is the Importance of Customer Experience Overinflated? by analytics expert Bob Hayes.

Here are a few other common mistakes that confound people attempting to draw conclusions from data.

  • Over-extrapolating results to a larger population. Let’s say you collect online survey responses in the U.S. Should these be extrapolated worldwide, including people who don’t use the Internet? Probably not, although I’ve seen this done quite a lot.
  • Drawing conclusions from insignificant differences. We’ve all seen political polls reported with a margin of error. If the difference between two candidates is within the margin of error, there is no “winner,” statistically speaking. But I rarely see margins of error or statistical confidence levels reported in business research.
  • Failure to use a control group. Suppose a marketer finds a particular sales incentive is correlated with higher revenue. To verify that the offer is what’s driving sales and not something else, the incentive should be given to an experimental group while an otherwise identical control group is left alone. If the offer causes a (statistically) significant lift in sales vs. the control group, that is solid evidence of a causal relationship (see the sketch below).
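
To make the control-group test concrete, here’s a minimal sketch of a two-proportion z-test, one standard way to check whether a lift over control is statistically significant (the campaign numbers are hypothetical):

```python
import math

def lift_significance(conv_test, n_test, conv_ctrl, n_ctrl):
    """Two-proportion z-test: is the test group's conversion rate
    significantly higher than the control group's?"""
    p_t, p_c = conv_test / n_test, conv_ctrl / n_ctrl
    p_pool = (conv_test + conv_ctrl) / (n_test + n_ctrl)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_test + 1 / n_ctrl))
    z = (p_t - p_c) / se
    p_value = 0.5 * math.erfc(z / math.sqrt(2))  # one-sided p-value
    return p_t, p_c, z, p_value

# Hypothetical campaign: 10,000 customers get the offer, 10,000 don't
p_t, p_c, z, p = lift_significance(conv_test=560, n_test=10_000,
                                   conv_ctrl=500, n_ctrl=10_000)
print(f"test {p_t:.1%} vs. control {p_c:.1%}: z = {z:.2f}, p = {p:.3f}")
# test 5.6% vs. control 5.0%: z = 1.89, p = 0.029 -- significant at the
# 5 percent level, but not by much; a smaller sample would not have been.
```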

Data Scientists to the Rescue?

Getting valid insights from data and analytics requires specialized training and a lot of experience. The expanding world of Big Data has elevated the role of the so-called Data Scientist, which IBM sees as an evolution of the business analyst. Training should include: “computer science and applications, modeling, statistics, analytics and math.” The data scientist should also have the business skills needed to identify valuable problems to solve, and to influence business and IT leaders to make the right decisions.

That’s a tall order. McKinsey estimates the U.S. faces a shortage of 140,000 to 190,000 analysts, and that a much larger group of 1.5 million professionals lacks the skills needed to “understand and make decisions based on the analysis of big data.” To close the gap, I believe business leaders (that’s you) need to push their own development so they can ask better questions and avoid common pitfalls. I also think analytics vendors need to address the skills gap by making solutions more usable for business managers. According to marketing analytics expert Pelin Thorogood, CEO of Anametrix, that happy day is quite a few years down the road.

Until then, Man and Machine will need to work together to make the best customer-centric decisions.

Further Reading:
  • One Thing You Need To Know About Statistics. Maybe Two (Bob Hayes)
  • Correlation is Not Causation: Big Data Challenges and Related Truths That Will Impact Business Success in 2013 (Michael Lowenstein)
  • What Really Drives Customer Loyalty? It’s Not Just About the Experience!

2 COMMENTS

  1. Bob: there’s a lot here that connects to my experience. Humans crave explanations, so much so that we’re suckers for them. Stephen Dubner’s book Freakonomics has a famous section about all the reasons why the crime rate in major cities declined. Many people offered explanations: better policing, better education, improvements in the economy . . . the list went on. Of course, the reasons given were all self-serving. The most plausible explanation wasn’t any of these (in case you haven’t read the book, I don’t want to spoil it by offering it here).

    One software company I sold for offered the sales team an elaborate Excel spreadsheet tool for plugging in customer data to project the financial and operational outcomes of implementing my employer’s software. On the fifth tab was a stock price calculator, the implication being that implementing the company’s product would have a direct bearing on improvements in the prospect’s stock price.

    The tool was useful in a few ways, but for ethical reasons, I had to stop short of using that bit of analytical wizardry. To the point in your blog, that’s a place where correlation and causation become all muddled. Nobody said selling isn’t an exercise in distortion.

    The correlation-causation issue can best be summed up with a quote: “Everyone who eats pickles dies.”

    Who can argue that?

  2. Bob –

    Great, incisive piece, especially the focus on the tyranny often coming from statistics, and the irrationality and lack of linearity among consumers; and thanks for the mention of my recent CustomerThink blog post: http://www.customerthink.com/blog/correlation_is_not_causation_big_data_challenges_and_related_truths_that_will_impact_business_s

    Not only should correlation not be confused with, or conflated with, causation, but the reverse is also true. In my response to Maz Iqbal’s comments on that post, I noted that researchers and marketers rather easily fall into the Venus flytrap of logical fallacy. In the haste to make correlations work, they miss that the reasoning behind an argument or hypothesis can be flawed. It’s just as inappropriate to assume that correlation proves causation, i.e., that when two events occur together they must have a cause-and-effect relationship. This, from my long-ago memory of Latin, is “cum/post hoc ergo propter hoc” (“with/after this, therefore because of this”). Implying, hinting, or even suggesting a connection between events doesn’t make it real, in science or in marketing.
