Thinking Agile: Thoughts from X Change



Agile development methodologies have become the dominant paradigm for enterprise digital IT development. So it was only a mild surprise that an X Change Huddle on Agile Analytics was packed to the gills. What wasn’t surprising was that nearly everyone there was deeply involved in the problems and opportunities that Agile presents.

If you’re not familiar with IT methodologies and why they matter to digital analytics (count your blessings), here’s a quick primer before I dive into the meat of the discussion and some of my biggest takeaways.

Agile methodologies were a response to the documentation- and process-heavy development methodologies that dominated enterprise development in the ’90s. The goal of Agile methodologies is to create smaller teams (often called Scrums) focused on very short time-horizon development projects. Because of this, Agile methods deliver faster cycle times (with more incremental releases) and release cycles that are highly tuned to business needs. A typical Scrum team includes a Product Owner, the development team, and a Scrum Master (every methodology needs its silly titles) who is really just a project manager trained in the intricacies of Agile. Development projects start with user-stories written by the Product Owner, which are then encapsulated in Sprints – short release cycles with specific, agreed-upon goals.

It sounds pretty good, right?

And not only does it work pretty well in general, it seems exceptionally well suited to digital development where incremental improvements and short cycle times seem to fit the bill perfectly.

So what’s the problem?

Well, when it comes to analytics, Agile presents both problems and opportunities. Let’s start with the bad.

Analytics traditionally hasn’t been done with Agile. At nearly every enterprise, analytics starts with a large requirements and standards definition process. This nearly always spans the entire site (or sites) and follows what looks like a traditional waterfall approach (conception, analysis, design, implementation, testing, and maintenance).

So the first and biggest question for analytics in an Agile environment is whether/how you should adapt your tagging process to fit an Agile methodology. For an existing site or a comprehensive roll-out of a new tool, you’ll almost certainly still use a waterfall approach to tagging. After that, however, it’s unlikely you’ll be able to continue that way for long. With Agile teams popping up, releasing new features, and then disappearing, it’s hard to see how a traditional waterfall approach to tagging will ever work.

This doesn’t mean your standards all go out the window, but they’ll no longer look much like static documents. It does mean that you’ll need to be constantly in a measurement requirements (and design, implementation, and testing) cycle. With Agile, all of those steps are likely to be going on constantly across different Scrums.

Here’s another challenge – and it’s a doozy. When I meet with prospective clients about implementations, one of the first things I ask is whether the developers have ever implemented tags before and whether the development team is stable. When the answers are “no” and “no”, the price goes up. Sometimes way up. If you’ve ever tried to support tagging you know why that’s the case. It makes a HUGE difference in the ease of rolling out tags. But having stable resources dedicated to tagging works a lot better with large, waterfall-like processes. With Agile, teams are smaller, change often, and re-form quickly. So unless you have a lot of tag-savvy developers, you’ll constantly be pushed to re-train your teams.

Which brings me to another challenge with Agile – how analytics requirements get attached to the user-stories. Should the measurement requirements be a separate set of user-stories – or are they meta-data on top of the existing user-stories? And what happens if a development project is purely about measurement? I’m assuming that in that last case the user-stories are measurement stories – but in most cases it feels more natural to have the measurement requirements as a form of meta-data about the user-stories.

If you take the user-story approach, one of the dangers that was discussed at X Change was that measurement user-stories often get dropped in the Sprint prioritizations. Unsurprisingly, I’m no fan of that. On the other hand, if you don’t create user-stories for measurement, you’ll have no natural way to take true advantage of the Agile methodology within analytics (more on this in just a bit).

My quasi-preferred direction would be to make all required elements of your measurement standard as non-optional meta-data on the user-stories. If you’re implementing something, it must have that level of measurement. However, all measurement beyond the required standard would be user-story based and could be prioritized in or out of Sprints.
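One way to picture that split, purely as a sketch (the field names and required-standard list here are invented for illustration, not from any real Agile tooling), is to treat the baseline measurement standard as required meta-data that every user-story must carry, while anything beyond the standard becomes its own prioritizable story:

```javascript
// Sketch only: field names and the required list are hypothetical.
// The baseline measurement standard rides as non-optional meta-data on
// every user-story; measurement beyond the standard gets its own stories.
const REQUIRED_MEASUREMENT = ["pageName", "siteSection", "eventTracking"];

function validateStory(story) {
  const meta = story.measurement || {};
  const missing = REQUIRED_MEASUREMENT.filter((field) => !(field in meta));
  return { ok: missing.length === 0, missing };
}

const story = {
  title: "As a shopper, I can filter products by price",
  measurement: { pageName: "product-list", siteSection: "catalog" },
};

// eventTracking is absent, so this story fails the measurement standard
// and shouldn't be accepted into a Sprint until the meta-data is complete.
console.log(validateStory(story));
```

The point of the sketch is that the required standard is enforced as a gate, not negotiated Sprint by Sprint – only the above-standard measurement competes in prioritization.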

No matter which approach you take, it seems clear that you’ll need to attach measurement resources to every Scrum. Some of the most advanced shops targeted a ratio of one measurement resource for every two Scrums.

Given the exigencies of the Agile environment with small rapidly evolving teams and multiple incremental projects running simultaneously, it seems like the perfect setup for a Tag Management System. With a TMS, you have the ability to drive measurement outside of the Agile process – and, at the very least, insulate yourself from the challenges of dealing with short-lived development teams.

Unfortunately there’s a hidden and pretty serious problem with relying on a TMS in an Agile environment. The real challenge in any tagging implementation revolves around page customization. Dropping the base tag onto a page is almost never an issue. When you do page customizations with a TMS, you’re typically extracting values from the URL or DOM for your customizations. In an Agile environment, the development teams typically won’t have any idea what measurement you’ve put on the page with a TMS. That means that if they make a change to the DOM, they may be inadvertently erasing measurement. In a waterfall approach, you have far less chance of this happening because both development and measurement teams are working to an identical defined set of requirements. But in an Agile environment, there is no bulky parallel process to rely on.
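To make the fragility concrete, here’s a minimal sketch of the kind of extraction rule a TMS customization typically relies on (the URL scheme and rule are hypothetical, invented for illustration):

```javascript
// Sketch only: the URL scheme and extraction rule are hypothetical.
// A TMS page customization often scrapes a value out of the URL like this:
function extractProductId(pageUrl) {
  // Assumes URLs shaped like https://example.com/products/<id>/detail
  const segments = new URL(pageUrl).pathname.split("/").filter(Boolean);
  return segments[0] === "products" ? segments[1] : undefined;
}

extractProductId("https://example.com/products/sku-123/detail"); // "sku-123"

// If a Scrum team ships a new URL scheme without knowing this rule exists,
// extraction silently returns undefined and measurement quietly breaks:
extractProductId("https://example.com/catalog/sku-123"); // undefined
```

Nothing fails loudly – no build breaks, no error fires – which is exactly why a development team that never sees the TMS rules can wreck measurement without realizing it.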

In other words, it doesn’t seem that a TMS will reduce the need for measurement team presence within Agile teams since the chance of inadvertently wrecking page customizations will be ever present. It might even make the application of measurement worse – or at least riskier since it will be so easy for uncoordinated Agile development teams to damage the measurement without realizing they’ve done so.

Most of the discussion in our Huddle focused on the challenges of adapting analytics to Agile. But the story isn’t all about methodological difficulties. There are opportunities to re-think the measurement requirements process to take advantage of Agile. After all, one of the worst aspects of tagging is that it forces the organization to pre-commit to measurement. With Agile cycles, however, there’s an opportunity to deliver measurement in incremental batches. You can start with basic measurement and, as you explore the data, create Agile user-stories around measurement that allow you to extend and enhance the depth of your measurement.

This is a fundamentally different approach to measurement implementation and it has obvious attractions to any measurement team that is stuck without a TMS in a waterfall environment. Because in the traditional measurement environment, if you don’t have all the measurement you need you’re stuck with a very long development cycle before you have a chance to fix or enhance anything.

So when it comes to measurement, there’s some new opportunity and some real challenges when it comes to Agile.

I don’t think there’s a place other than X Change where a topic like this can get its due. Agile presents profound challenges to Web analytics professionals. But it’s neither sexy nor easily digested. It’s poor fodder for conference-style presentations with little opportunity for bragging rights. A great huddle and a uniquely X Change-like experience.

And speaking of conferences, I’m on my way out to eMetrics in Boston! If you’re coming out, look me up – and check out the extensive Semphonic team there. Paul Legutko is speaking on Monday in the Advanced Analytics track on Scorecarding from Verbatims. Phil Kemelor is part of a panel, “Analytics Adoption in Real-Life” (with a bevy of Semphonic clients including Intel, KPMG and the Nature Conservancy). Greg Dowling is speaking and leading the entire Mobile Measurement track. I’m part of a Keynote panel on Social – “Getting Down to Business”. A pretty good demonstration of the breadth of practice here and a great opportunity to meet some of the team.

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores performed including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young's Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.

