Should marketers be able to answer these 5 stats questions?


Statistics, Statistics, Statistics

Last week, I gently ribbed the Marketing Leadership Council of the CEB over their HBR post, Marketers Flunk the Big Data Test. I felt the statistics they were sharing about marketers not being very statistically savvy were themselves a little questionable.

In response, one of the authors of that post, Anna Bird, graciously reached out to me to address some of my concerns.

The HBR post stated that when they asked Fortune 1000 marketers five “basic to intermediate” statistics questions, 44% of those marketers got four or more answers wrong. Only 6% of the marketers they interviewed got all five right. The implication was that marketers lacked statistical aptitude — a serious concern in an increasingly analytical and data-driven discipline.

My immediate reaction: what were the questions?

After all, statistics can often be technical and counterintuitive. Reams of academic research have shown that people in general are very bad at reasoning statistically. Were the questions asked a fair instrument to judge marketers’ capabilities in this brave new world?

You can judge for yourself. Anna shared their questions with me and gave me permission to post them here. How many of these can you get right? (Answers at the bottom of this post — no peeking.)

  1. True or False: Variance measures the dispersion of data around a sample average.
  2. True or False: How well the sample reflects the target population is more important than how large the sample is.
  3. True or False: Regressions with more variables are always more credible than regressions using fewer variables.
  4. There are two hospitals in town. One is very large; the other is very small. On a particular day, 80% of babies born in one hospital are male. For which hospital is this more likely to be true?
  5. A bat and a ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

Okay, scroll down and peek now — how’d you do?

If you didn’t ace it, take comfort: only a small percentage of people on the planet could. In fact, that was one of my other objections to the original presentation of the CEB’s findings. Sure, marketers might be bad at statistics. But if almost everyone is bad at statistics, and marketers are no different than everyone else in that regard, is this worth ringing alarm bells over?

“Our goal wasn’t to rate marketers’ data savvy relative to other individuals (since the vast majority of people get that [last] question wrong),” wrote Anna. “But just to find out whether marketing analytics actually poses risks (as well as benefits) due to everyone’s natural propensity for error when interpreting numbers. We think that this risk is under-appreciated.”

So is it possible that marketers, bad at stats as they are, are actually better than most?

“No one is great at stats/analytics, but marketers are actually better than most functions, except finance and strategy. However, marketing’s information needs are also far more complex and ambiguous than most functions — so they really need better data skills. Generally, we believe that most employees — but marketers in particular — are unprepared to take advantage of emerging technology and data. And that a bit more stats training (even just basic training) would help substantially.”

All good points, and I appreciate Anna’s willingness to openly engage in this discussion. I largely agree with their core premise. But I still wonder: are these types of statistics questions measuring the right thing? Are there better kinds of questions to evaluate analytical preparedness? What do you think?

Answers:
1. True. 2. True. 3. False. 4. The smaller. 5. 5 cents.
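If you want to see why the last two answers come out the way they do, here is a quick sketch in Python. This is my own illustration, not part of the CEB’s study, and the daily birth counts of 5 and 50 are assumptions chosen only to make the contrast visible: the smaller hospital’s daily share of male births swings far more widely around 50%, and the bat-and-ball answer is simple algebra.

import random

# Question 4: smaller samples swing more widely around the true 50% rate.
# The daily birth counts (5 and 50) are hypothetical, for illustration only.
def share_of_days_with_80_percent_boys(births_per_day, num_days=10000):
    """Simulate num_days days and return how often 80%+ of births are male."""
    extreme_days = 0
    for _ in range(num_days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day >= 0.8:
            extreme_days += 1
    return extreme_days / num_days

print("Small hospital (5 births/day): ", share_of_days_with_80_percent_boys(5))
print("Large hospital (50 births/day):", share_of_days_with_80_percent_boys(50))
# The small hospital records an 80% day far more often, because sampling
# noise shrinks as the sample grows -- which is also the point of question 2.

# Question 5: bat + ball = 1.10 and bat = ball + 1.00,
# so ball + (ball + 1.00) = 1.10, which gives ball = 0.05, i.e. 5 cents.
ball = (1.10 - 1.00) / 2
print("The ball costs:", round(ball, 2))

With these assumed numbers, the small hospital hits an 80%-male day roughly one time in five, while the large hospital essentially never does.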

Republished with author's permission from original post.

Scott Brinker
Scott Brinker is the president & CTO of ion interactive, a leading provider of post-click marketing software and services. He writes the Conversion Science column on Search Engine Land and frequently speaks at industry events such as SMX, Pubcon and Search Insider Summit. He chairs the marketing track at the Semantic Technology Conference. He also writes a blog on marketing technology, Chief Marketing Technologist.

1 COMMENT

  1. … without misleading headlines?

The more I see from the CEB, the less I like it. The organization is very clever at creating research and promoting it with eye-catching headlines like the one on this post.

I’m not going to bother debating the merit of the questions, except to say the last one (#5) is not even about statistics. That bat and ball question is a puzzle that most people get wrong (me included), but it’s not statistics.

    Let’s take the original article, which concludes (via the headline) that marketers “flunk” the Big Data test because they rely too much on gut, can’t do basic statistics, etc. Who decided what the “test” was? CEB. And that marketers flunked? CEB.

    Maybe CEB should spend more time on quality research and less on marketing copy in the form of articles.

Other examples from the CEB: the HBR article “Stop Trying to Delight Your Customers,” which I rebutted here:
    http://www.customerthink.com/article/keep_delighting_your_customers

And then there’s the “challenger” sales methodology, which CEB promoted in another HBR article, “The End Of Solutions Sales.” Discussion here: http://partnersinexcellenceblog.com/the-end-of-solutions-sales/

Why should anyone believe CEB’s research? It’s not peer reviewed, and based on the promotional tactics I’m seeing, it seems designed to stir up consulting business.
