Virtuous Cycles

A virtuous cycle occurs when your organization analyzes, tests, learns, operationalizes, and feeds the results back into the analytics. It’s a virtuous cycle because the input from each step makes the next step better and because the whole process loops, with interpretation of data driving tests and the outcome of tests sharpening and enhancing the next round of analytics. Like collaboration, virtuous cycles are one of those “mom and apple pie” concepts that everyone is in favor of but very few people seem to get right. Unlike collaboration, this failure isn’t primarily a matter of under-investment. It’s more that virtuous cycles require lots of moving pieces, all of which have to work well. That’s hard to achieve in any endeavor. In this series, I’m covering a set of guiding principles for building a truly effective analytics center of excellence, and I put virtuous cycles last because many of the things you need to do to create virtuous cycles are embodied in the first six principles on the list:

  1. Business-driven, not exploratory, analytics
  2. Investing in analytics as a portfolio
  3. Agile methods for analytics
  4. Creating operational strategies for every analysis
  5. Coupling analytics and reporting to drive data democratization
  6. Making collaboration a core capability, not an add-on

At the most basic level, before you can create virtuous cycles around analytics, you have to be able to consistently create valuable analysis. One great analysis doesn’t create or sustain a cycle of improvement. The analytics function has to be able to consistently drive value in an ongoing process; otherwise, the business can’t create the cadence and internal processes necessary to act on analysis. This need for consistency is one of the many reasons why exploratory analytics won’t work. Only within the context of focused research and consistent methodologies will the analytics function be able to generate value consistently enough to drive a virtuous cycle. In next week’s post, I’m going to talk about the key methodologies we use in digital analytics to drive virtuous cycles and explain why they work so well in iterative environments.

Using an agile, team-based approach to analytics is another essential ingredient in achieving the consistency necessary to drive continuous improvement. Attrition happens. So does promotion. When your rockstar analyst becomes a manager, will you have replacements ready? By driving analytics with small, close-knit teams, you create natural mechanisms for knowledge transfer and mentoring. Over the long haul, those mechanisms are critical to having the resources necessary to deliver value consistently. Small agile teams also create a natural and protected institutional memory of the last analytics cycle around a problem. Having team members already familiar with the data, the key business problems, and the last set of recommendations makes driving a new analysis cycle vastly easier.

Producing good analysis is only the first step in creating a virtuous cycle, though. Once you’ve got the intelligence, it has to be effectively communicated and, of course, someone has to act on it. That’s where collaboration and operationalization come in.

As I’ve enumerated these principles, I’ve tried to call out the interesting and deep ways they interact and support each other; that’s certainly true of the relationship between operationalizing analytics and creating virtuous cycles. Unless you take action, you can’t complete the cycle. But it’s also true that virtuous cycles make operationalizing analytics much easier. Our best clients have established a regular cadence of analysis, socialization, and testing. When site owners expect to drive site change on a regular, analysis-driven cycle, the resources and focus to take action are consistently in place and the effort required to operationalize an analysis is dramatically reduced. Part of what makes a client who’s established that analytics cadence so great is that you know the chances of your analytic recommendations being acted on are dramatically higher.

In fact, the single most important part of creating virtuous cycles is cadence. You have to commit to doing analysis over. And over. And over. You never stop. Finding the right cadence isn’t simple, and there is no one right interval between analyses. Every business and every property has a different dynamic in terms of how often it changes, how much it can test, and how valuable the resulting improvement is. The right analytics cadence for a Website might be a month, a quarter or even a year. The duration is less important than the fact of regularity.

Testing is another critically important component of operationalizing analytics. Analysis rarely produces complete answers. Usually, it identifies problems or opportunities. How you solve those problems or try to recover those opportunities will have a huge impact on whether or not you’re successful. Nor should it simply be assumed that analysis is always right. A testing capability provides the natural operational mechanism to flesh out analytics and make it real.
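
To make that concrete, here’s a minimal sketch, in Python with made-up numbers, of the kind of check a testing capability might run before an analytic recommendation gets rolled out. The two-proportion z-test and the 0.05 cutoff are illustrative assumptions, not a prescription from this post.

```python
# Minimal sketch: sanity-checking a test of an analytic recommendation
# before it gets operationalized. Counts and threshold are made up.
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """One-sided two-proportion z-test: did variant B beat control A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 1 - NormalDist().cdf(z)   # one-sided: is B better than A?
    return z, p_value

# Control vs. the change suggested by the last analysis cycle.
z, p = two_proportion_z_test(conv_a=480, n_a=10_000, conv_b=545, n_b=10_000)
print(f"z = {z:.2f}, one-sided p = {p:.3f}")
if p < 0.05:
    print("Variant wins: roll it out and feed the result into the next analysis.")
else:
    print("No clear win: revisit the hypothesis in the next cycle.")
```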

The tight relationship between analytics and testing makes it particularly appalling when organizations split these two functions and make them almost completely independent. All that stuff I wrote about reporting and analytics? It applies even more to analytics and testing. While splitting reporting and analytics may not be a great idea, it’s easy to understand why many people think it might be. But splitting analytics and testing is just stupid.

Speaking of which, don’t forget the virtuous cycles created by coupling analytics and reporting. As I pointed out previously, one of the truly unique benefits of this approach is that when a model embedded in a tool is used by the business and it doesn’t produce an accurate forecast, there’s a natural drive for the business to want to improve the model. As I know from my days in direct response, models improve via use and tuning. You’re never finished learning, and by embedding models into the core of your data democratization effort, you create an almost magical feedback cycle.
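
As a rough illustration of that feedback loop, here’s a hypothetical sketch of watching an embedded forecast model and flagging it for re-tuning when it drifts. The error metric, the 15% threshold, and the numbers are assumptions invented for the example.

```python
# Rough sketch of the model feedback loop: compare what the embedded
# model forecast against what actually happened, and flag it for
# re-tuning when the average miss gets too large.

def mean_absolute_percentage_error(actuals, forecasts):
    """Average relative miss of the forecasts against the actuals."""
    return sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

def needs_retuning(actuals, forecasts, threshold=0.15):
    """True when the model's average forecast error exceeds the threshold."""
    return mean_absolute_percentage_error(actuals, forecasts) > threshold

# Hypothetical weekly conversions vs. the embedded model's forecasts.
actuals   = [1020, 980, 1100, 1250]
forecasts = [ 900, 990, 1400,  950]

if needs_retuning(actuals, forecasts):
    print("Forecast drift detected: schedule a tuning pass in this analytics cycle.")
else:
    print("Model is tracking well: leave it in place until the next review.")
```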

Consistent analytics. Cadence. Testing. Communication. These are the fundamental building blocks on which cycles of continuous improvement are built. But they aren’t achieved by mandate. If you can’t generate consistent analysis, cadence is pointless and so is testing. The principles I’ve outlined in this series work together to create the web of processes necessary to create, support and sustain these building blocks. Unfortunately, you can get the first six principles right and still miss out on all the benefits of virtuous cycles. The extra ingredient is wrapping all that good work in a process that provides a consistent cadence to every aspect of the analytics process, from insight to test to operations. When you’ve got all those pieces in place, you’ll truly have built something special. You’ll have an analytics capability that drives consistent, ongoing improvement in the business. That’s what creating a great analytics capability is, ultimately, all about.

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores performed including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young's Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.
