Skepticism: An Antidote for Statistical Malpractice

Andrew Rudin | Sep 14, 2017


Michael Shermer, founder of the Skeptics Society, doesn’t suffer fools. He questions assertions that others accept as fact. He challenges claims of “scientifically proven” when he doesn’t see any science. He examines experimental hypotheses, and weighs research methods and data. “The principle is to start off skeptical and be open-minded enough to change your mind if the evidence is overwhelming, but the burden of proof is on the person making the claim,” he says, adding, “I would change my mind about Bigfoot if you showed me an actual body, not a guy in an ape suit in a blurry photograph.” [emphasis mine]

Just like those grainy images of Bigfoot, marketers often use shaky evidence to support contrivances of irrefutable proof. They crow that numbers don’t lie, and latch onto statistical tidbits to drive home their points. “Studies show . . .” The cliché preamble to a sales pitch.

“Hands down, the two most dangerous words in the English language today are ‘studies show,’” Andy Kessler wrote in a Wall Street Journal editorial, Studies Are Usually Bunk, Study Shows (August 13, 2017). “If a conclusion sounds wrong to you, you’re probably not a hung-over grad student.” Snarky, but I get his point. Heavy partiers make poor skeptics. What about the rest of us?

Marketers routinely spin study percentages into clickbait. For instance: 39 Shocking Sales Stats that Will Change the Way You Sell, from which I drew this sampling:

  • Email marketing has 2x higher ROI than cold calling.
  • 92% of all customer interactions happen on the phone.
  • 92% of salespeople give up after four “no’s”, but 80% of prospects say “no” four times before they say “yes”.
  • 44% of salespeople give up after one follow-up call.
  • 68% of companies struggle with lead generation.
  • 50% of sales time is wasted on unproductive prospecting.

The article gives separate sources for each of these nuggets, but from there, tracing their provenance becomes convoluted.

No matter. Many stats get bandied about without scrutiny. They’re embedded in tweets that are liked and retweeted. The shares are shared, and those shares get re-shared. And on, and on. Along the way, the original meanings of many numbers get warped or over-amplified. Flawed findings mutate into hallowed truths. As these clipped numerical snapshots flow through the social media pipeline, they lose meaning and transform into verbal nothingness: “44% of salespeople give up after one follow-up call.” Since I don’t know the research definition of give up, how it was measured, or the operational meaning of a follow-up call, it’s impossible to draw any insight from this statement. But the person who shared it in this article has a goal, which is to change the way I sell.

“There’s some kind of weird thing that happens to people when mathematical scores are trotted out. They just start closing their eyes and believing it because it’s math,” says Cathy O’Neil, author of the book, Weapons of Math Destruction. Statistics persuade. That’s often the point of using them. They can give authoritative glamor to sales pitches. They can also be used for malevolent purposes, like distracting prospects from seeing truth, as Coca-Cola recently demonstrated.

Coca-Cola funded faux research through the now-defunct Global Energy Balance Network (GEBN). The objective was to promote the falsehood that preventing obesity didn’t depend on people eating less, or (most importantly) drinking less soda. Instead, the wonks at GEBN told us that people just need to be more active. It was a message Coke knew people wanted to hear, and it advanced the company’s revenue objectives at the same time: getting thin didn’t mean anyone needed to give up the habit of swilling a 40-ounce Big Gulp.

Try this out: in a search window, type “% of marketing content goes unused by sales” – quotes and all. Just now, my top three results (out of 3,150) are “80%,” “70%,” and “60%.” Further down the stack I see “90%.” If you’re not sure what to believe, you’re in good company. Still, I get a strong vibe: content sucks. And not just content in general, but my content.

The not-subliminal message: “Be really, really dissatisfied with your content strategy, and remember, we content gurus can fix it!” Predictably, every link led me to a company with a product or service to dig prospects out of a rut they probably didn’t even know existed.

Jordan Ellenberg, author of the book How Not to Be Wrong, refers to the practice of hyping stand-alone percentages as statistical malpractice. When a study’s percentage or finding is cherry-picked and stripped of its context, additional results, ancillary data, explanatory detail, and caveats, its meaning becomes corrupted. In this way, instead of imparting understanding and knowledge, statistics serve as mathematical bling to trick out a marketing message.

Unless you are skeptical, you’ll miss the flip side of these percentages, which conveys a different reality. Based on the percentages from my original query, anywhere between 10% and 40% of content is used by sales. Marketing teams create content for many different purposes, and looked at this way, the statistic (whichever one you choose to believe – 60%, 70%, 80%, or 90%) seems less dire. Further, content creation is innovation, and compared with another measure of innovation efficiency – for example, the 85% failure rate of new consumer product introductions – these numbers become less alarming. Relief! Perhaps you can wait another year before hiring an intern to spiff up your company’s content.
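For the spreadsheet-inclined, the flip side is nothing more exotic than a complement. A two-line sketch (the input percentages are simply the ones my search returned, nothing new):

```python
# Each claimed "unused content" percentage implies a complementary
# share of content that sales DOES use: used = 1 - unused.
claimed_unused = [0.60, 0.70, 0.80, 0.90]  # the figures from the search results

used = [round(1.0 - u, 2) for u in claimed_unused]
print(used)  # the same data, reframed as content that is used
```

Same numbers, different frame – which is exactly why the frame deserves scrutiny.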

Guidelines for the aspiring healthy skeptic. James Loewen, author of Lies My Teacher Told Me, recommends questions for vetting history textbooks. I’ve adapted his points for marketing and sales:

  1. Why was the study conducted?
  2. Why were the measurements chosen? Which ones weren’t – and why?
  3. In presenting the findings, whose viewpoint is reflected – and whose was omitted?
  4. Do the points of the study/article cohere? Are they logical? Are they believable? What explains the anomalies?
  5. Are the findings corroborated elsewhere?

Many biz-dev articles I read decompose rapidly when tested on numbers 4 and 5. If the “Top 3 success traits in a salesperson are [X], [Y], and [Z],” what explains salespeople who are successful despite having a completely different trio of characteristics? And I’ll wager that another study of customer interactions could be conducted using a different sample, yielding a substantially different result from “92% of them happen on the phone.” All too often, those sharing such information are happy to reply to accolades posted on their articles, but don’t respond when pressed for more detail or clarification. I’m assuming they’d rather not be bothered.

According to Andy Kessler, “Many of the studies quoted in newspaper articles and pop-psychology books are one-offs anyway. In August 2015, the Center for Open Science published a study in which 270 researchers spent four years trying to reproduce 100 leading psychology experiments. They successfully replicated only 39 . . . Add to this a Nature survey of 1,576 scientists published last year. ‘More than 70% of researchers have tried and failed to reproduce another scientist’s experiments,’ the survey report concludes. ‘And more than half have failed to reproduce their own experiments.’” If we chose to be similarly introspective in marketing and sales, our performance would likely not fare any better.

I’m under no delusions that my squeaky complaining will slow the tsunami of statistical malpractice. But if asking these pointed questions causes anyone to pause before hitting the Retweet button, or to hesitate before chiming “spot on!” following a study gratuitously calling itself authoritative, then mission accomplished.

Daniel Levitin, author of A Field Guide to Lies, provides a counterpoint to hyped statistics, one that underscores that the burden of proof must always be on the person making the claim:

“Statistics, because they are numbers, appear to us to be cold, hard facts. It seems that they represent facts given to us by nature and it’s just a matter of finding them. But it’s important to remember that people gather statistics. People choose what to count, how to go about counting, which of the resulting numbers they will share with us, and which words they will use to describe and interpret those numbers. Statistics are not facts. They are interpretations. And your interpretation may be just as good as, or better than, that of the person reporting them to you.”

What makes statistical malpractice insidious isn’t that the percentages are purposely shocking. It’s that the numbers are actually fairly ordinary, and they pander to our biases. Everyone has experienced a salesperson who is slovenly or unmotivated. It’s not a stretch to believe a “finding” that 44% of them give up a customer pursuit after one perfunctory follow-up call. Nearly everyone struggles with time management. Who would be astonished to learn that 50% of sales time is wasted on unproductive prospecting? This is the “secret sauce” in statistical persuasion: find a bias, and harden it with a number. Never mind that terms like give up, struggle, and unproductive are too fuzzy to have meaning in an experimental sense.

Grainy images, be damned. No matter what, people still really want to see Bigfoot.
