Analyst vs. Implementer – Redux



I’m going to try for an ambitious double posting this weekend – I’ve been working on my ongoing series on Digital Analytics and Database Marketing Convergence so I have the latest in that series. I also wanted to take the time to respond to Adam Greco’s detailed, thoughtful and interesting “reply to a reply” around some best practices in Omniture implementations.

I’m going to start with the second topic.

Every school produces different students. Just as James Bond might be able to tell not just the year and grape but the side of the mountain on which a wine originated (and probably the weather that year), so too do we classify painters, musicians, economists and even businessmen. The school of GE and the school of P&G are both as famous as they are distinct.

A week or so ago, I wrote Analyst vs. Implementer as a commentary on Adam Greco’s blog on “Top Omniture Implementation Pet Peeves.” My thesis in that post was one that I have touched on before; namely that there are two very distinct schools of Omniture implementers; I called these schools Technical Implementers and Analysts. Of course, my thesis didn’t end with the simple naming of these schools. I described technical implementers as being primarily focused on knowledge of Omniture as a system and as being primarily concerned with the cleanliness of an implementation. I would say, further, that the school they are implicitly produced in is Omniture; whether working at Omniture or being trained by Omniture or simply having worked on nothing but Omniture implementations. They tend to have a strongly software-centric view of implementations.

I contrasted the Technical school with the Analytics school. Analyst implementers are primarily concerned with the richness of an Omniture implementation; they are focused on data capture, not on cleanliness, and their primary concern with an implementation is that it not sacrifice information. They are concerned with what I might describe as sins of omission rather than sins of commission. I would say, further, that the school they are implicitly produced in is outside Omniture, they have worked in multiple systems, and they come to implementation via analysis. They tend to have much less interest in the software as a system.

At least insofar as any broad generalization can capture reality, I think these “schools” do so. I think they aptly describe significant features of the way people approach a set of real world problems (Omniture implementations). What’s more, I think it’s important that people understand the differences between the two schools and why it might matter. Those in the Technical school are not shy (and why should they be?) about the obvious advantages they enjoy. When thinking about an Omniture implementation, it’s surely an advantage to work at Omniture or to have worked there, it’s surely an advantage to have a deep technical knowledge of the software, and it’s surely an advantage to know how to produce a clean implementation.

That being so, why should those of us who come from a different school and think it has its own advantages be any less shy of discussing them? I’ve come to believe that there is no more important or better preparation for doing really good Omniture (or other Web Analytics) implementations than doing analysis – especially with Omniture, though outside of it as well. At Semphonic, we don’t have a single person on staff who’s ever worked for Omniture. It’s not a policy or anything – it’s just worked out that way. And yet, I believe that our implementations are consistently deeper and more useful than implementations produced by those trained in that school. Why? I think it’s because we train analysts first and implementers second; and I believe SiteCatalyst happens to be the sort of software system that rewards that approach.

In Analyst vs. Implementer, I described my “two schools” thesis and used Adam’s post as an example. It wasn’t so much that I disagreed with Adam’s pet peeves (which were obviously sound advice), as that I found them representative of a set of concerns that I find secondary. In particular, I highlighted one issue (which happened to be his #1 pet peeve) that I thought was not just secondary but, at least in its full description, distinctly misguided. That issue was the duplication of eVars as sProps.

So, in effect, my post contained one large argument (that two schools around Omniture implementations exist, that the concerns of those schools are different, and that the Analyst school is ultimately focused on the most important set of problems, and that this difference was well illustrated in the types of things that made Adam’s list and, in particular, in his #1 pet peeve around eVar duplication). Inevitably, much of the focus of this immediately devolved into a Twitter argument about sProps and eVars; which, if you think about it, is rather ironic and totally representative of the Technical school.

In his reply, however, Adam addressed both sides of the argument. He starts with a discussion of the broader thread and then dives down into a technical discussion of the sProp/eVar question.

Now it’s pretty clear Adam resented getting lumped into the “Technical” school, and I get that – particularly since I was finding fault. Composers hate getting labeled as neo-romantic just as writers hate getting labeled post-modern. The best practitioners tend to be the most resentful of classification and Adam is, to my mind, the best practitioner in the Technical school (sorry – I just think it fits) and always has been – it’s what makes him so effective. He has regularly shown (and shows in his reply – read his List sProp discussion for example) a fairly remarkable interest in enabling and using analytic techniques. To my mind, there aren’t other representatives of that school who are remotely his equal. So it was probably unfair on my part to choose one of his posts to illustrate my thesis. Truly, my apologies!

On the other hand, I think “Pet Peeves” is far from his finest work – and was much more representative of the “school” than his frequent ability to transcend it. Not a single one of his pet peeves would have made my list had I tackled a comparable topic. Not one. They are, without exception, technical sins. The sort of sins, if I may borrow another Catholic analogy, for which I would need but one bead on a rosary to atone. On issue after issue, the big problems with Omniture implementations (failure to capture the necessary information or turn on the right capabilities to use it) seemed to me to be missing and, in the case of the eVar/sProp discussion, treated rather poorly.

Nevertheless, after digesting my take, Adam came back with a very detailed and carefully considered reply.

I’ll summarize it (as best I can) as two separate threads.

In the first thread, Adam rejects my thesis that there is something significant about the differences in what the two of us would likely call out as pet peeves – mine being almost all sins of omission (what’s missing) and his being overwhelmingly sins of commission (what’s there). On the other hand, I think Adam offers more of a disagreement with my thesis than an argument against it. It’s just not clear to me which of the top implementation peeves listed might be taken as representative of a deep concern for the analytic potential of an implementation. Nor does Adam really take up the essence of the argument except from a purely personal perspective. Is it really unhelpful to create this type of classification? I don’t think so. Not only is it common practice in every discipline, but an interesting classification does real intellectual work. What’s more, it’s a classification that Omniture Professional Services (and every other Web Analytics vendor) and every spin-off of Omniture PSC uses with regularity – only spun as an advantage not a disadvantage.

I believe that not only is the difference real, but that it’s far more important than our differences at the technical level over sProps. I hope it’s not foolishly immodest to say flatly that I think Adam and I are both really good at this stuff – both experts in the field. I think it’s surprising and interesting that we don’t share similar implementation pet peeves. To me, that reflects more than technical arguments over sProps and I’m sticking by my thesis until someone suggests a better one to explain the difference.

So that’s thread #1.

In thread #2, we get into the true technical nitty-gritty and Adam offers a whole array of reasons why duplication is a poor-practice. Now, I once had a Professor of Philosophy who told me that if someone gives you ten arguments for something, it’s probably because none of them are good enough to stand on their own!

However, before I fault Adam for that, I have to admit my original post relied on an identical laundry list of why duplication might be appropriate. So either Adam and I are both offering a series of weak arguments or else this is simply the kind of question where you have to think across a list of practical pros-and-cons and decide where you stand. I’m going with option #2, but I do think that some of Adam’s laundry list of cons feel a bit post hoc – reasons manufactured to justify an argument, not reasons that you would have started out with if you had no cause to defend.

I’ll classify Adam’s arguments in four categories: uncommonness (there really aren’t that many cases where duplication adds benefit), hidden costs (latency, $$$, page load), variable conservation, and UI confusion (users get confused).

Let’s start with uncommonness. I’ve shortened Adam’s post but I don’t think I’ve done damage to any of them:

1. Using List sProps –…I maintain that the use of List sProps was justifiably covered in my statement of other sProp uses that are “few and far between.” I don’t use List sProps very often because I feel that there are better ways to achieve the same goals. …List sProps have severe limitations and there is a reason that they are rarely used (maybe 2% of the implementations I have seen use them). I have found that you can achieve almost any goal you want to use List sProps for by re-using the Products variable and its multi-value capabilities instead. By using the Products variable, you can associate list items to KPI’s (Success Events) rather than just Traffic metrics…

2. Page-Based Containers & Segmentation…[no real argument here]

3. Correlations – With respect to Correlations… I included Correlations in my list! I also mentioned that this justification for using an sProp may go away in SiteCatalyst v15 where all eVars have Full Subrelations. Also, one of the reasons I prefer Subrelations to Correlations is that Correlations only show intersections (Page Views) and do not show any cross-tabulation of KPI’s (Success Events). Personally, I would disagree …about over-doing Correlations, since in my experience, implementing too many Correlations (especially 5-item or 20-item Correlations), with too many unique values, can cost a lot of $$$, lead to corruption and latency.

4. Pathing – In the area of Pathing… on the same page about its importance… Again, I might differ … in that I don’t think enabling Pathing on too many sProps is a good idea since it can cost $$$ and produce report suite latency, which is why I prefer to use Pathing only when it adds value.

I just don’t find any of this at all compelling.

While using the products variable as a substitute for a List sProp is a great solution when you can do it – and certainly more representative of the best thinking Adam does – it has some drawbacks. The biggest is that the products variable usually contains product information. Only sites that aren’t doing ANY ecommerce have the luxury of that strategy unless they want to dump a bunch of confusing non-product information into their product reports. Second, it only conveniently works once. Yes, if we have a single list we want to save and we don’t have any eCommerce, we’ll use the products variable. Otherwise, we use List sProps. No doubt this is an edge case, but it comes up more often these days than it used to. It’s an important technique for capturing module and internal ad impressions on a site. That it only shows up in 2% of implementations doesn’t mean much – few really good practices in Omniture from an analytics perspective show up much more than that.
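To make the trade-off concrete, here is a minimal sketch of the two approaches to capturing a set of module impressions. The variable slot (prop10), module names, and helper functions are all hypothetical – they are not from Adam’s post or mine – and the products-variable trick only works if s.products isn’t already carrying real commerce data.

```javascript
// Hypothetical sketch: two ways to record on-page module impressions
// in SiteCatalyst v14-era code. Variable numbers and names illustrative.

// Option 1: a List sProp – comma-delimited values in a single prop
// (the prop must have list support enabled in the admin console).
function buildListProp(modules) {
  return modules.join(',');
}

// Option 2: re-using the products variable, as Adam suggests – each
// "product" slot carries a module name with an empty category, so the
// values can be associated with Success Events.
function buildProductsList(modules) {
  var parts = [];
  for (var i = 0; i < modules.length; i++) {
    parts.push(';' + modules[i]); // ";product" syntax with empty category
  }
  return parts.join(',');
}

// e.g. s.prop10  = buildListProp(['heroBanner', 'sideNav']);
//      s.products = buildProductsList(['heroBanner', 'sideNav']);
```

The design difference is exactly the one argued above: option 2 buys event cross-tabulation at the price of polluting product reports on any site that sells things.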

I think the correlations discussion is just wrong – the weakest argument in the whole post. Omniture provides most clients with unlimited, free two-way correlations (crosstabs) of sProps – configurable in the interface. I rarely recommend purchasing additional 5 or 20 item correlations and I don’t think anything in my post suggested otherwise ($$$ implication to the contrary). Most Omniture contracts let you fully cross-tabulate (subrelate in Omniture speak) only 3 eVars. After that, you have to pay more. It’s true that Correlation was included in Adam’s original list of reasons why you might duplicate, but if you need lots of correlations and you don’t get them from eVars without paying extra, then doesn’t that mean you should be duplicating eVars as sProps?

So I think Adam misses the point. Two-way cross-tabulation is fundamental and common. If, in V14, you want to do that, it’s much cheaper and often easier (Adam doesn’t touch on my points about the complexity of eVar cross-tabulation at all) to do it with sProps. I just don’t see an argument here much less a convincing one.

Since we don’t really disagree on pathing, there isn’t much to add. I’m pretty sure I never suggested the indiscriminate purchase of pathing on sProps. People accuse me of many things, but indiscriminateness isn’t often one of them. I did suggest that one of my pet peeves is when people pay for pathing and don’t use it. Yep. And I still think it’s illustrative of my thesis and what I didn’t like about “Pet Peeves”. Adam’s pet peeve on pathing was a case when implementations path a static, unchanging variable. Undeniably stupid and a total waste, but, again, it would just never have crossed my mind as a serious problem. Underuse of pathing is a far, far more common and serious problem than misuse or overuse.

So that’s uncommonness. I feel like my arguments here are untouched. sProps add value and do so in a pretty significant number of cases and at less cost than if you rely solely on eVars.

So is there a hidden downside?

Here’s the next set of Adam’s bullet-points on the problems of duplication:

  • Over-implementing variables and enabling features unnecessarily can cause report suite latency
  • Over-implementing variables can increase page load time, which can negatively impact conversion
  • Over-implementing variables and features can cost additional $$$ as described above (e.g. Pathing, Correlations)

I’d lump all of these under the category of hidden costs. Both latency and page-load time are legitimate issues, but I’m not so sure about dollar-costs. Over-implementing features can cost you money, but I’m unclear what that has to do with duplication of eVars as sProps. Unless you start adding extra fee items, there simply is no cost to this.

I suppose one could argue that if you make an eVar an sProp you’ll then be somehow tempted to turn on pathing or twenty-item correlations unnecessarily and that will cost you money. But that feels a bit like the arguments I hear on political campaign ads – “With Prop 99, politicians will give the power to Insurance companies to trick your senile grandmother out of her inheritance.” Uh yeah. I’m sure that’s what the Proposition is really about. Duplication of eVars as sProps doesn’t cost a dime and implying that it does is simply not right.

So what about page-time and latency?

Let’s deal with latency first. It’s very hard to argue with latency issues, which is why I’ve always felt it’s a kind of technical “bogeyman” that implementers throw out to keep us analytic folks off-balance. I’ve always taken the attitude that Omniture latency issues are Omniture problems, not customer problems. In fact, I’ll throw out another of my pet peeves here. It’s when Omniture Marketing folks sell analytics software based on the vastly greater number of variables and events it supports and then Omniture technical folks tell you that you can’t use those variables or your system won’t work well. That’s a pet peeve of mine for sure! In fact, though, I don’t think they do this as much as they used to because the system mostly works pretty well.

I’ve observed that most latency issues on Omniture are systemic not report suite specific (that’s why we usually have multiple customers suffering at the same time). That means that if you don’t take advantage of features but everyone else does, you pay the price but get no benefit. You could refuse to login to SiteCatalyst at 9AM because it causes UI latency, but you won’t get your reports first thing in the morning if you don’t. When Omniture latency issues strike, our customers with just a few variables seem to suffer just as much as our customers with a veritable variable banquet.

Nor am I convinced that sProps are a significant contributor to report suite latency. They are much simpler to process and report on (for Omniture) than eVars. So if you’re genuinely concerned about latency in your report suites, you probably need to concentrate on removing eVars and events. I’d be very surprised if duplicating 20 props adds the overhead of adding a single event in an implementation rich in eVars. Adam knows more about this internal guts stuff than I do, but I doubt that he really thinks sProps are a huge driver of latency and, in any case, I’m not willing to sacrifice my implementation (or my client’s) on the threadbare hope that it will improve Omniture latency. I think this is a case where too much concern for the technical implementation on the Omniture side short-sells the client’s real interest.

In my last post, I argued that page load time was critical in today’s organization. So Adam has me on page time, since duplication does add a tiny bit of code. Still, by my calculation, if your average page passes 30 eVars, then duplicating them as sProps would cost you about 180 bytes. But that assumes that you don’t need ANY of them passed as sProps for a good implementation. If I’m right and it’s a common case that you actually need many of them as sProps anyway, then full duplication might cost you somewhere between 20 and 80 wasted bytes. As fanatical as I am about page load times, that’s really, really small in the scheme of an Omniture implementation. In fact, there are cases where wholesale duplication might actually save you code space since it can be handled on the back-end. If wasting 20 bytes in your page code is enough to qualify as a top implementation pet peeve, have at it!
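The arithmetic above can be sketched directly. The 6-bytes-per-variable figure is an assumption inferred from the post’s own numbers (30 eVars, ~180 bytes – roughly what SiteCatalyst’s dynamic-variable shorthand like “D=v1” costs on the request); treat it as illustrative, not measured.

```javascript
// Back-of-the-envelope cost of duplicating eVars as sProps.
// bytesPerVar is an assumption (~6 bytes via "D=vN" shorthand).
function duplicationCostBytes(evarCount, alreadyNeededAsProps, bytesPerVar) {
  // Only the props you didn't need anyway count as waste.
  var wasted = evarCount - alreadyNeededAsProps;
  return wasted * bytesPerVar;
}

// 30 eVars, none needed as sProps: the full ~180-byte worst case.
// 30 eVars, 20 needed anyway: only ~60 wasted bytes.
```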

These issues all seem like small potatoes to me. Perhaps microscopic might be a better word than small. The benefits to duplication would have to be tiny not to outweigh these concerns and even if there were NO BENEFITS it’s hard to see how, with these as the drawbacks, the practice of wholesale duplication would count as the #1 pet peeve in Omniture implementations.

Which brings me to the two most interesting points: variable conservation and adoption. Here’s the first:

  • When you implement SiteCatalyst on a global scale, you often need to conserve variables for different departments or countries to track their own unique data points. This means that variables (even 75 of them!) are at a premium. Therefore, duplicating variables has, at times, caused issues in which clients run out of usable variables.

I completely agree with this. It’s pretty much the only reason why at Semphonic we don’t indiscriminately replicate sProps. The loss of valuable sProp slots in a global implementation is the single biggest reason why I classify it as a sloppy practice. That being said, Adam’s original pet peeve specifically stated that most sites didn’t need to use many sProps. Here’s the exact language:

I only set an sProp if:

• There is a need to see Unique Visitor counts for the values stored in the sProp

• There is a need for Pathing

• You have run out of eVar Subrelations and need to break one variable down by another through the use of a Correlation (which will go away in SiteCatalyst v15)

• There will be many values (exceeding the unique limits) and you just want data stored so I can get to it in DataWarehouse or Adobe Insight

For the most part, that is it [highlights mine]… Beyond that, I tend to use eVars and Success Events for most of my implementation items.

That last sentence was the one that particularly bothered me and that I objected to in my original post. If you don’t need sProps for much, you probably don’t need to worry about their conservation! But if it comes right down to it, I agree that variable conservation IS what makes duplication at least a mildly bad practice and I think this point is pretty much in tune with my original argument. You don’t waste sProps because they do have value. Duplication isn’t usually a waste, but sometimes it is – and that’s why wholesale duplication is lazy and less than ideal even if, in most cases, it isn’t a very big deal.

So what about Adam’s point about adoption?

Most importantly, however, is the impact on adoption. Again, I may be biased due to my in-house experience, but here is a real-life example: Let’s say that you have duplicated all eVars as sProps. Now you get a phone call from a new SiteCatalyst user (who you have begged/pleaded for six months to get to login!). The end-user says they are trying to see Form Completions broken down by City. They opened the City report, but were only able to see Page Views or Visits as metrics. Why can’t they find the Form Completions metric? Is SiteCatalyst broken? Of course not! The issue is that they have chosen to view the sProp version of the report instead of the eVar version. That makes sense to a SiteCatalyst expert, but I have seen the puzzled look on the faces of people who don’t have any desire to understand the difference between an sProp and an eVar! In fact, if you try to explain it to them, you will win the battle, but lose the war. In their minds, you just implemented something that is way too complicated. You’ve just lost one advocate for your web analytics program – all so that you can track City in an sProp when you may not have needed to in the first place. In my experience, adoption is a huge problem for web analytics and is a valid reason to think twice about whether duplicating an sProp is worthwhile. While I’ll admit that duplicating all variables certainly helps “cover your butt,” I worry about the people who are at the client, left to navigate a bloated, confusing implementation.

I notice that while Adam objects to my pitting an “analyst’s” experience against an “implementer’s,” this argument seems not unwilling to pit “in-house” experience against “consulting” experience. I’m not complaining. I think both are fair enough, though we at Semphonic certainly support a lot of Omniture users. I’m sure sometimes “in-house” vs. “consultative” makes for an interesting difference, though I don’t think this is one of those cases.

Still, it’s an interesting point and the hardest one to be sure about – maybe the only one in the post that really gave me pause. Do I agree with it? Nope, though I sure had to think about it a bit. The case Adam describes is dead-on. Could happen. I’m sure did happen.

But here’s another one that’s every bit or even more common.

User opens up a report on an eVar, tries to cross-tabulate it with page name, finds it’s not subrelated and that he’ll have to pay extra to see that. Decides he’d like a unique visitor count. But no – not available. Cross-tabulates it with another eVar and gets a set of numbers that just don’t seem right. Plays around with the report for a bit and then walks over and says, “Hey, this thing gives me totally different numbers when I look at something called Instances and something called Page Views – what the heck is that and why does one variable have two totally different values?”

Been there, done that.

Omniture is a deeply complicated system. The difference between eVars and sProps is notoriously difficult to communicate and understand. There are times when I’m not sure whether duplication is appropriate. I’m unconvinced that the right answer to this complexity is to remove the information that is its source. What’s more, if you don’t duplicate sProps and eVars, then the user has to know which list any given variable is likely to reside in. How are they to do that unless they understand why a variable is put in one place instead of another? Yes, you can re-engineer the menu system, but then you make life harder for regular Omniture users. As I wrote in my original post, many aspects of eVars (because they are state not instance variables) make them harder for users to understand and use appropriately than sProps; so while I can see Adam’s point here, I think the argument actually cuts in the completely opposite direction.

Eliminate the duplication, and you’ll force users to always deal with an even more difficult set of variables than if you retained both, and you’ll force them to know where to look every time they want to find a variable.

You’ll have to be a pretty sophisticated Omniture user to weigh our arguments here – and if you are, you’ll likely have your own opinions. But I think (just guessing) that a heavy majority would fall on my side of this particular issue. What’s more, if I’m right that many variables need to be duplicated, you aren’t really adding confusion by duplicating another four or five. The argument only works at all (and I don’t think it does work) if you assume that you won’t have to duplicate any or more than a few sProps.

I figure that by this point in my post, I’m down to the truly hard-core Omniture nuts (which surely includes both Adam and me). Some of these points are extremely technical and others frankly arcane. Do we really disagree much over List sProps? Probably not. Latency? Put it in the context of a different issue and I rather doubt it. In fact, I doubt we disagree over much of anything except – when you get right down to it – the real nub of my original post.

The nub of Analyst vs. Implementer is that the stuff that really bugs me is different from what really bugs Adam. You can disagree with my thesis about why that’s the case and still admit that there’s something interesting going on there that isn’t really captured in a discussion (however scintillating) about sProps vs. eVars. I believe (and think I’ve made a pretty darn convincing case) that there is far too much focus on the technical aspects of Omniture implementations and far too little on the analytic aspects. That focus leads to many, many sins of omission and to implementations that are much less useful than they ought to be. Lump every single one of Adam’s top implementation pet peeves together, and they mean less to me (and I believe should mean less to you) than a single missing meta-data variable!

But if you’re really hung up on sProps and eVars, what’s my best and final advice?

If you’re building an implementation it’s best (surprise!) to actually know why/whether you need a variable as an sProp or an eVar or both. It will save you six or seven bytes of page weight. It will give Omniture one less excuse for your report latency. It will make for a cleaner global implementation. It will, by only duplicating appropriate variables (of which there will be many), perhaps contribute to some long-term user understanding of the difference between sProps and eVars.

Take all of this together and it’s much, much more important to have an interesting idea for actually using one or both.

But if you really aren’t sure whether an eVar might make a good sProp (and vice versa), then by all means, duplicate. To my mind, duplication is a perfectly acceptable (free, very low-impact, very easy to implement, sometimes beneficial) way of never having to say you’re sorry – and I haven’t read a single even mildly convincing reason that makes me think otherwise.
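For the curious, the “free, very low-impact” duplication defended above can be done in one small loop. This is a hypothetical sketch, not Omniture’s or Semphonic’s actual code: in a real implementation the loop would sit inside the s_doPlugins callback on the s object, and the eVar count would match your contract; here it’s a plain function so the logic can be read on its own.

```javascript
// Hypothetical sketch: wholesale duplication of populated eVars into
// the matching sProp slots, without clobbering deliberately set props.
function duplicateEvarsAsProps(s, count) {
  for (var i = 1; i <= count; i++) {
    var val = s['eVar' + i];
    // Copy only when the eVar has a value and the prop slot is free.
    if (val && !s['prop' + i]) {
      s['prop' + i] = val;
    }
  }
  return s;
}

// e.g. inside s_doPlugins: duplicateEvarsAsProps(s, 30);
```

Because the copy happens in the core code file rather than on each page, this is also the back-end style of duplication that, as noted earlier, can actually save page-code space.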

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores performed including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young's Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.

