The Case for Collaboration as a Key Part of an Analytics Center of Excellence

There’s more to building an effective Center of Excellence in Analytics than creating an organizational placeholder. In this series, I’ve been walking through a set of guiding principles that we’ve found make the difference between adding another box on an org chart and having a transformative analytics practice. Today’s topic, collaboration, is a difficult one. Unlike my last post on why it’s a mistake to separate analytics and reporting in your organization (and it’s worth checking out Steve Robinson’s thoughtful commentary), collaboration isn’t exactly controversial. What’s difficult is that collaboration is a little too close to Mom and apple pie. We all support teamwork. Check. No need to think more about it, and no need to read what is likely to be nothing but the traditional set of consulting platitudes. There are topics on which it’s hard to say anything very interesting, and I’m afraid collaboration is one of them. I’ll do my best and, I promise, no platitudes!

Just as a reminder, here are the seven guiding principles for building an Analytics CoE that I’ve been working through:

  1. Business-driven, not exploratory, analytics
  2. Investing in analytics as a portfolio
  3. Agile methods for analytics
  4. Creating operational strategies for every analysis
  5. Coupling analytics and reporting to drive data democratization
  6. Making collaboration a core capability, not an add-on
  7. Creating virtuous cycles inside analytics

My main purpose in writing this post is to describe a concrete set of investments in collaboration that I think should be made as part of a CoE. Before I go there, however, I think it’s worth a short argument for the deep importance of collaboration in analytics.

Collaboration around analytics has a significantly higher ROI than collaboration in most other activities, and the reasons for that higher ROI are embedded in the very nature of analytics.

Many analytics problems are shared across organizational dividing lines that work well in other contexts. In Life Sciences, for example, it’s traditional to organize along brand lines. Brand-based structures keep the analytics and reporting close to the business – which is undeniably a good thing. But understanding and optimizing a PPC or Display program demands nearly identical analysis techniques for every brand. So unless you find ways to share the learnings across brands, you’re wasting a huge amount of analytics effort as analysts for one brand re-learn lessons already paid for by another.

Perhaps it’s my experience in digital that makes me see this deep demand for sharing analysis across different areas. One client I work with a lot happens to have a ridiculously diverse set of businesses that include everything from resort properties to engines to the kitchen sink. But despite having wildly diverse businesses, many of the digital analytics techniques we use are appropriate across these very different properties. Sure, there are plenty of differences too, but I’d guess that more than half of the digital analysis we do is common across properties. That degree of overlap represents a significant opportunity for collaboration. Sharing of methods and learnings provides much deeper leverage across analytics teams and greatly enhances the organizational investment in measurement.

I’d also argue that analytics is a craft discipline. What I mean by that is that it’s primarily learned by doing, and that the doing should be guided. I started (work) life as a programmer, and when I was starting a program – particularly in a language or system I didn’t know – the single most helpful resource I could find was an existing piece of code with somewhat similar function. If you’ve ever tried to use the Omniture API, for example, you know that it’s much easier to steal one of the code samples and start with that than to hack through the documentation to figure out the connection protocols and the basics of coding and executing calls. And I mean MUCH easier.
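
To make that concrete, here’s a minimal Python sketch of the sort of starter code I mean, aimed at the legacy Omniture (Adobe Analytics 1.4) REST interface. Treat the credentials as placeholders and the endpoint, method name, and WSSE signing recipe as assumptions to verify against the docs; fiddly details like these are exactly what a borrowed, working sample gets right for you.

```python
import base64
import datetime
import hashlib
import uuid

import requests  # third-party: pip install requests

# Placeholders: substitute your own Web Services credentials.
USERNAME = "user:company"       # hypothetical
SECRET = "your-shared-secret"   # hypothetical

def wsse_header(username, secret):
    """Build an X-WSSE UsernameToken header (assumed 1.4-era scheme)."""
    nonce = uuid.uuid4().hex  # raw nonce string used in the digest
    created = datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ")
    digest = base64.b64encode(
        hashlib.sha1((nonce + created + secret).encode()).digest()
    ).decode()
    return ('UsernameToken Username="{0}", PasswordDigest="{1}", '
            'Nonce="{2}", Created="{3}"').format(
        username, digest, base64.b64encode(nonce.encode()).decode(), created)

# A simple call to list report suites; the method name is from the
# legacy 1.4 API but should be confirmed against current documentation.
resp = requests.post(
    "https://api.omniture.com/admin/1.4/rest/?method=Company.GetReportSuites",
    headers={"X-WSSE": wsse_header(USERNAME, SECRET)},
)
print(resp.json())
```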

It’s the same with analytics. If you’re tasked with analyzing a Website, the single best starting point is another Website analysis.

The craft nature of analytics is one of the key reasons why collaborative mechanisms are a good investment when it comes to an analytics CoE. It’s also, by the way, greatly facilitated by the Agile approach to analytics that I suggested previously (wherein I somehow neglected to mention this aspect).

I also think that good analytics products have a much longer tail on the business side of the organization than is generally credited. Much of the collaboration I’ve talked about so far is focused on the analytics community within an organization. But collaborative mechanisms often help create institutional memory for analysis that can be effectively used by the business.

You can see the impact of this in world-class organizations. Some of the best analytics companies I know of have invested heavily in collaborative mechanisms for analytics – creating their own internal systems for sharing, annotating and commenting on analysis. While the analytics community is usually the driver behind this collaborative exercise, the stakeholder community shares many of the benefits. Such systems make it possible to easily find analytics around specific issues and to view the history of their use and interpretation.

Some of our clients have used an internal Wiki to similar effect. Every analytics product is placed in the Wiki – creating not only a centralized repository for the team, but a fantastic resource for the business. The Wiki also functions as a version control system – a benefit of no small importance, as anyone who has ever hunted through their email trying to find the “final” version of an analysis can attest. This is a really low-cost way to build collaboration.

Okay, that’s the “peace, love and harmony” portion of my argument. Collaboration is good.

Unfortunately, collaboration suffers from a “tragedy of the commons”-style syndrome. Enterprises, just like governments, tend to under-invest in public goods. Because the costs of pollution aren’t borne by anybody in particular, the usual mechanisms for balancing benefit and cost go out of whack. In the enterprise, work that doesn’t generate attributable ROI often goes unfunded. Collaboration benefits lots of people, but it’s hard to pinpoint the benefits and even harder to credit them appropriately.

So given a choice, enterprises tend to invest in the tangibles. Between investing in analysis projects or collaboration, they’ll usually choose projects. Ditto for collaboration tools versus analytics tools, and collaboration resources versus analytics resources. At very small levels of investment, such choices are appropriate. When you only have one herder, you don’t need to worry about the commons! But as you scale, the chances grow that collaboration will be underfunded.

So how do you invest in collaboration?

Here are some ideas we’ve seen clients try and succeed with.

The Analytics Repository: Whether it’s a Wiki, a community, or a tool, having a centralized repository for analytics is one of the keys to effective collaboration. A really great repository brings together the ability to search, view, comment, and even drill down into analytics projects. Some collaboration tools provide mechanisms for exploring analytics findings in Tableau, for example. So you can read about an analysis, see the highlights, and then actually drill into the data. How cool is that? No matter what tool you choose to build the repository, however, process is the key to making it work. You have to make sure that every analysis is stored, that versions are updated, and that appropriate tagging is applied. Once a community is active, this tends to be easy, but in the early stages, you’ll need to really work to create and enforce the processes necessary to achieve the critical mass a healthy community requires.
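
As an illustration of that process point (not of any particular product), here’s a toy Python sketch of the storage discipline a repository has to enforce: every analysis carries an owner, a version, and tags, and untagged work is rejected at the door. All the field names here are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

# Toy in-memory model of the repository discipline described above.
# Field names are illustrative assumptions, not a product schema.
@dataclass
class Analysis:
    title: str
    owner: str
    version: int
    published: date
    tags: set = field(default_factory=set)

repository = []

def store(analysis):
    """Enforce the process: no analysis enters the repository untagged."""
    if not analysis.tags:
        raise ValueError("Every analysis must be tagged before storage.")
    repository.append(analysis)

def search(tag):
    """Find every stored analysis (all versions) carrying a given tag."""
    return [a for a in repository if tag in a.tags]

store(Analysis("Q3 Display Program Review", "jsmith", 2, date(2013, 9, 6),
               {"display", "paid-media", "brand-x"}))
print([a.title for a in search("display")])
```

The point isn’t the code; it’s that the gate in store() is a process decision. Whatever tool you pick, something has to play that gatekeeper role.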

Agile Teams: I’ve already described the significant benefits to the agile approach in terms of delivering analytics, but it’s also a clear path to improved collaboration. This is also a great way to start sharing techniques across largely siloed disciplines. If you want your digital analysts to start doing more sophisticated analysis, one of the best methods is to create teams combining traditional customer analysts and digital analysts. Agile teams also create a natural mechanism for collaboration between consultant and client – driving knowledge transfer and greatly improving the ROI to the client.

Meta-Data Repositories: A meta-data repository is a data set created around the techniques you’ve used in digital. Ideally, it captures interesting information about the tactics you’ve tried and their actual performance. Most commonly, you’d create a meta-data repository around campaign techniques. But the technique is almost infinitely extensible. A repository might include information from your testing program, your social media program, or your content creation efforts. The key is to create interesting meta-data. Having a campaign name and information about performance isn’t enough. For Display campaigns, you’d want to know the formats purchased, the type of inventory, the network used, the targeting techniques, the target audience, and perhaps meta-data descriptions of the creative (product-focused, call-to-action, etc.). With a meta-data repository, a marketer responsible for planning a campaign can quickly search for any Display campaign targeted toward a similar audience and then quickly slice and dice performance by format and targeting method. It’s fairly easy to create some form of analytics repository, but a meta-data repository takes a little more work. You really need to pay attention to the meta-data you collect and make sure you have processes in place to collect the data consistently and accurately. Once you’ve done that, a visualization tool like Tableau or Spotfire makes a great access point (though you can also use traditional Web analytics methods like SAINT classifications). This past Friday I was walking a client through a demonstration of a meta-data repository, built in Tableau, that tracks theatrical releases for a major studio across the past twenty years, and it struck me anew just how valuable this type of asset is. It’s a great foundation for deep analytics, a powerful collaboration tool, and a wonderful source of institutional memory all wrapped up in a single, fairly inexpensive project.
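
Here’s a hypothetical sketch of what that slice-and-dice looks like in practice, using pandas in place of Tableau or Spotfire; the campaign rows, column names, and metrics are all invented for illustration.

```python
import pandas as pd  # third-party: pip install pandas

# A hypothetical Display campaign meta-data repository. Every column
# name and value here is invented for illustration.
campaigns = pd.DataFrame([
    {"campaign": "Spring Launch", "format": "300x250", "inventory": "premium",
     "network": "NetA", "targeting": "behavioral", "audience": "parents",
     "creative": "product-focused", "ctr": 0.0021, "cpa": 41.0},
    {"campaign": "Holiday Push", "format": "728x90", "inventory": "exchange",
     "network": "NetB", "targeting": "contextual", "audience": "parents",
     "creative": "call-to-action", "ctr": 0.0013, "cpa": 55.0},
    {"campaign": "Brand Refresh", "format": "300x250", "inventory": "exchange",
     "network": "NetA", "targeting": "retargeting", "audience": "teens",
     "creative": "call-to-action", "ctr": 0.0034, "cpa": 29.0},
])

# The planner's question: among campaigns aimed at a similar audience,
# how does performance break down by format and targeting method?
similar = campaigns[campaigns["audience"] == "parents"]
print(similar.pivot_table(index="format", columns="targeting",
                          values=["cpa", "ctr"], aggfunc="mean"))
```

In real life the repository would live behind Tableau or a database rather than an in-memory frame, but the shape of the question stays the same.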

Community: A large part of effective collaboration is nothing more than creating good habits around communication. I’ve often talked about the importance of virtuous cycles around analytics, and much of that is just creating cycles of communication. One of my enduring frustrations in Voice of Customer analytics, for example, is that consumers of survey data almost never have much input into subsequent rounds of survey research. What a waste. In Voice of Customer research, it’s vitally important to have a tool that allows you to easily and cost-effectively change your survey instrument. It’s even more important to have a process that can take advantage of that by pushing survey information out to stakeholders and creating collaborative communication about how to drive the research deeper. Many of our best clients have analytics communities and work hard to bring content to them – both internal and external. I regularly do intra-company webinars for clients on topics like multi-channel analytics, Voice of Customer research, social media, and big data. They’re a great, almost zero-cost way to bring cutting-edge analytics into the organization and push the best of your analytics thinking across it.

If you look at world-class analytics organizations, I think you’ll be consistently impressed with how much they invest in collaboration – especially compared to their less successful peers. That investment takes many forms, from technology to process to structure. There’s that old saw about “the best things in life are free.” Collaboration isn’t quite free, but even with an investment in resources and collaboration technologies, it’s very inexpensive compared to many other analytics investments. But free and easy are most definitely not synonyms. Like any other valuable commodity, working well together takes work.

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores performed including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young's Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.
