Coupling Analytics and Reporting

The way an enterprise is organized says more about what it cares about than all the mission statements in the world. Organization mirrors power and reflects priorities. With the dramatic growth in analytics as an organizational priority, we’re seeing the inevitable and corresponding growth of analytics in the organization. From the increasing recognition of the need for a Chief Analytics Officer (CAO) to the widespread push for enterprise Analytics Centers of Excellence (CoE), the shift toward analytics as a legitimate priority in the enterprise is obvious. But there’s more to creating an effective Center of Excellence in Analytics than creating an organizational placeholder. In this series, I’ve been walking through a set of guiding principles that we’ve found make the difference between having another box on an org chart and having a transformative analytics practice:

  1. Business-driven, not exploratory, analytics
  2. Investing in analytics as a portfolio
  3. Agile methods for analytics
  4. Creating operational strategies for every analysis
  5. Coupling analytics and reporting to drive data democratization
  6. Making collaboration a core capability, not an add-on
  7. Creating virtuous cycles inside analytics

Today, I’m going to dive into the relationship between analytics and reporting (#5). I’m going to argue that the prevailing wisdom about reporting and analytics is wrong and that the most common emergent organizational structure around analytics and reporting is backwards.

In the analytics maturity curve I present, I draw the single largest gap between reporting and analysis.

[Figure: Analytics Maturity Curve]

The reason for that large gap is simple. Enterprises tend to get stuck doing reporting and often find that their resources are tied up building and running reports (both ad hoc and regular). All that ad hoc reporting does drive value; it’s important for people to understand how the digital systems behave and reporting helps create that understanding. But there’s a widespread and entirely correct consensus that most of that reporting doesn’t actually have much impact on how well the business works. “We have data, but we don’t use data!” is the litany I hear, endlessly repeated.

Reporting is both prior to and necessary for analytics. You can’t just leap to analytics. People have to know enough about the digital system to ask the right questions. They have to have enough vocabulary and business understanding to absorb the results of analysis. Reporting creates that foundation.

On the other hand, reporting chews up analyst free cycles and is notorious for expanding to the point where analysts simply don’t have time to do analysis.

So it’s no surprise that when it comes time to build an Analytics capability, one of the first and biggest organizational priorities is to separate reporting and analytics. The thinking behind this separation is pretty obvious. Reports aren’t analytics. Reporting requires a different (lesser) skill set. Reporting distracts from analytics and has been the single biggest hindrance to getting analytics done. Splitting the two functions just seems natural.

 Despite this seemingly natural logic, splitting reporting and analytics is a bad idea.

We all know that there’s a terrible shortage of experienced and capable analysts. Even as we began to see an easing in the shortages around classic Web analysts, the state-of-the-art moved into much more advanced and demanding data scientists – folks with the ability to manipulate data, generate algorithmic analysis, and do advanced statistical work as well. People with these skills do exist, but with the competition for them so high, they are very difficult to hire and even harder to retain. What’s more, your data science efforts had better be well-funded, your corporate brand impeccable, your culture highly sympathetic, and you’d best be located in Silicon Valley or New York.

In other words, hiring a complete team is hard and for most enterprises, impossible. So you’re going to have to grow your analytics talent. That usually means hiring smart people with no experience. And when you do that, you’re going to want to be able to give them tasks that are manageable and that will gradually increase their understanding of the data and the potential of analysis. In our practice, that’s traditionally meant starting out junior hires on reporting.

The expectation, however, is clear. Nobody wants to spend their career as a report monkey. Even our offshore team in India complained because all we were giving them was reporting and rote manual tasks. They wanted meatier projects with real analysis! To build an effective CoE, you need a natural path between reporting and analytics. I don’t think that’s made easier by splitting the two functions.

There’s a deeper reason why separation is a bad idea, though, and it has nothing to do with organization.

In my experience, reporting is far harder than analytics. Not every analysis project is a success – we’ve had a few failures and a decent number of so-so results – but most of our analysis projects deliver value.  I just can’t say the same for reporting projects. I’d be inclined to think we just suck at reporting, but in fact the same could be said of pretty much ALL the reporting I see. The overwhelming majority of digital reporting efforts are failures (even where the stakeholders might not agree). To me, a successful reporting project is one that deepens the organizational understanding of the system and improves decision-making. The digital analytics reporting I see achieves neither of these goals, and I could drop the words “digital analytics” from this sentence and still say the same.

There are a couple of big (and related) problems with enterprise reporting:

  1. Focus on site-wide, actionable KPIs
  2. Focus on the current-state

I’ve talked many times (and have some great presentations if you’re interested) about why there is no such thing as an actionable KPI and how the focus on a small number of high-level KPIs has produced reports that are beautiful but worse than meaningless.

KPIs fail for two reasons: noise and lack of context. Noise is simple to understand. Sites typically have many different functions and many different audiences. You aren’t going to sell product to a visitor who’s come to the site for support. I tell our clients that before you can meaningfully interpret a metric, you need to know who you’re talking about and what they are trying to accomplish. Is traffic on your site up? Meaningless. Until I know whether that traffic is customers or prospects, buyers or support-seekers, I don’t know anything. Every metric should be placed in the context of a 2-Tiered segmentation that defines who the metric is about and what they were trying to accomplish.

This leads us to build high-level reports like this in which segmentation is the critical component:

[Figure: Flow Process Step 1]
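
To make the 2-Tiered idea concrete, here’s a minimal sketch in Python/pandas. The column names (visitor_type, visit_intent) and the toy data are hypothetical stand-ins for whatever your implementation actually captures:

```python
import pandas as pd

# Hypothetical visit-level data. Tier 1: who the visitor is; Tier 2: what
# they came to accomplish. Names and values are illustrative only.
visits = pd.DataFrame({
    "visitor_type": ["prospect", "customer", "customer", "prospect", "customer"],
    "visit_intent": ["buy", "support", "buy", "research", "support"],
    "converted":    [1, 0, 1, 0, 0],
})

# The site-wide number hides more than it reveals.
print("Site-wide conversion:", visits["converted"].mean())

# The same metric inside the two-tier segmentation is actually interpretable:
# a support-seeking customer who doesn't buy isn't a failure.
report = (visits.groupby(["visitor_type", "visit_intent"])["converted"]
          .agg(["count", "mean"])
          .rename(columns={"count": "visits", "mean": "conversion_rate"}))
print(report)
```

Nothing fancy, but every number in that output answers “who, trying to do what?”, which is exactly what the site-wide number can’t.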

Lack of context is the second great issue with KPIs. Suppose your conversion rate increased. Good news? Maybe, if it means your site’s efficiency improved. But what if conversion rate is up because your SEO just took a big hit and you lost a significant percentage of your less-qualified buyers; is that still good news? There are countless reasons why conversion rate might change. Without context, interpreting the change is impossible, and the initial, intuitive understanding of the change is highly likely to be mistaken.
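
To see how easily the headline number misleads, here’s a before-and-after sketch. The figures are invented purely for illustration:

```python
# Purely illustrative numbers: an SEO hit strips out less-qualified traffic.
before_visits, before_orders = 100_000, 2_000
after_visits, after_orders = 70_000, 1_750

print(f"before: {before_orders / before_visits:.1%} conversion")  # 2.0%
print(f"after:  {after_orders / after_visits:.1%} conversion")    # 2.5% -- "up 25%!"
print(f"orders change: {after_orders - before_orders:+d}")        # -250 orders
```

The headline metric improved by a quarter while the business sold 250 fewer orders.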

What’s true for conversion rate is true for every popular KPI. By simplifying our reports into a small number of actionable KPIs, we’ve encouraged decision-makers to leap to conclusions (conversion rate up – that’s good) that simply aren’t justified. Our reports haven’t deepened understanding, they’ve reduced it!

Context is a very hard problem to solve. Reports just aren’t good at capturing the relationship between variables. But it’s in the relationship between variables that all the real business understanding resides.


The second huge problem with reporting is the focus on what happened – what I call the current-state. Hey, that’s what reporting is, right?

It shouldn’t be.

I liken most enterprise reporting to listening to a weather report where all they tell you is the current temperature. I can open a window if I want to feel how cold it is right now. Ninety-nine times out of a hundred, what I want to know is what the weather is going to be – later in the day, tomorrow, in Atlanta, or on Super-Bowl Sunday. In other words, I want to know what to expect in the future, not what happened in the past.

I think we can all agree that our reporting does an almost universally terrible job of letting us know what to expect. That’s why enterprise reporting (at best) tends to solve only half the problem – and by far the least interesting half. I can say that knowing what to expect is an analytics problem, but it’s an analytics problem that should be embedded in reporting. After all, it’s not analysts who need to understand what will happen – it’s decision makers.

Here’s where the two issues become deeply related, because to build a model of a system you need to address both visitor segmentation and the relationships between variables. That’s pretty much what a model is. So building a model inherently solves the problems around noise and context that cripple most reporting efforts.

To go beyond current-state, we need to solve the problems of noise and context. Having done that, we can deliver reports that are both explanatory (how the variables interacted to produce the observed change) and predictive (what is likely to happen). I call this analytic reporting, and it’s what we’re trying to do with all of our recent enterprise reporting projects.
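
As a rough illustration of the pattern (illustrative only: the driver names, data, and choice of a simple linear regression are invented, not our actual models), here’s what the explanatory and predictive halves can look like for a single segment:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical weekly history for one visitor segment. The drivers and the
# generating relationship are invented for illustration.
rng = np.random.default_rng(0)
weeks = 52
paid_traffic = rng.normal(10_000, 1_500, weeks)
organic_traffic = rng.normal(25_000, 3_000, weeks)
orders = 0.012 * paid_traffic + 0.008 * organic_traffic + rng.normal(0, 40, weeks)

X = np.column_stack([paid_traffic, organic_traffic])
model = LinearRegression().fit(X, orders)

# Explanatory: how the drivers relate to the outcome (marginal orders per visit).
print(dict(zip(["paid", "organic"], model.coef_.round(4))))

# Predictive: what to expect under a planned traffic mix, reported next to actuals.
planned = np.array([[11_000, 24_000]])
print("expected orders:", round(float(model.predict(planned)[0])))
```

In a real report the coefficients would be translated into plain-language driver contributions and the prediction would sit next to the actuals, but the shape of the thing is the same.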

We’ve built these model-embedded reports across a variety of problems now, from digital marketing mix to mobile application impact to product launch effectiveness. The approach will work for ANY system that can be modeled, and it can be delivered on a variety of platforms (from Excel to Tableau to custom code).

Analytic reporting is the right way to build enterprise reports. It fundamentally changes the paradigm. By moving from a snapshot of the current state to an understanding of why things happened and what will happen next, reports become tools to run the business. Tools are much more useful (and used) than reports. By encapsulating a model, these reports help drive real understanding. They put users hands-on with the levers that actually make a difference to business outcomes and let them test different strategies. They also create the foundation for a true alerting system, since the model can be used to identify places where unexpected variation occurs. This makes for a far more intelligent system of alerting or highlighting than is possible using simple measures of variation.
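
One way to read that last point (a sketch of the idea, not a prescribed method, and the numbers below are invented) is to alert on misses against the model’s expectation rather than on raw swings in the metric:

```python
import numpy as np

# Hypothetical weekly series: the model's expectation vs. what actually happened.
expected = np.array([510, 495, 530, 520, 505, 515])
actual = np.array([505, 500, 525, 430, 510, 512])

residuals = actual - expected
sigma = residuals.std(ddof=1)

# Alert when the miss against the model exceeds two standard deviations of the
# historical residuals, not when the raw metric itself merely swings.
for week, (a, e, r) in enumerate(zip(actual, expected, residuals), start=1):
    if abs(r) > 2 * sigma:
        print(f"week {week}: actual {a} vs expected {e} -- investigate")
```

Week 4 gets flagged because it badly misses what the model expected, not because the metric simply moved; ordinary week-to-week movement stays quiet.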

Indeed, I’ve come to believe that this strategy is fundamental to effective data democratization. By capturing the intelligence of your analysts in the models, analytic reporting creates a framework for understanding the business that makes the data both intelligible and usable at a very high level. This isn’t about sharing data or making reports accessible. It’s about creating and sharing knowledge. Analytic reporting has, in addition, a unique side benefit. What happens when a forecast goes awry (as forecasts will do)? The business stakeholders go back and ask the analysts to fix the model, driving ever deeper understanding of the business. That’s unique. Analytic reporting creates a tight integration of analysis and data democratization that drives a virtuous cycle of improvement in the analytics program.

If analytic reporting is the key to effective data democratization, then what’s the logic behind splitting reporting and analytics? You’ve just separated the very functions that need to be tightly coupled to make each most effective!

I’ve had to go a long way around to establish why splitting analytics and reporting at the enterprise level is a mistake, but it all comes down to a point that I think is fairly straightforward. Data democratization isn’t fundamentally about giving stakeholders access to data. It’s about giving stakeholders access to the intelligence necessary to drive the business. Traditional reporting doesn’t do that, and it won’t, no matter how pretty your reports are and no matter how fancy the tool you use to deliver them. You can deliver wonderful visualizations in Tableau without solving either the segmentation or the context problem. Analytic reporting, no matter how it’s delivered, solves both. But to deliver analytic reporting, you need a tight coupling of your analytics and reporting.

This may run counter to the current orthodoxy, but the current orthodoxy is based on a fundamental mistake about the reporting function and its relationship to analytics.

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores performed including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young's Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.
