Governance and other Dirty Words


In what has become something of a tradition, Phil Kemelor (VP here at Semphonic) released this year’s edition of our Profiles in Analytics research. Profiles in Analytics is based on our own survey research of the Web analytics community. Each year has been fascinating and so, too, is the apparent evolution of the community. There are three parts to Profiles in Analytics – all based on the survey data. First, there are the summarized survey results themselves. The second piece, which gives the research its name, is a set of representative profiles based on the survey responses. The final piece is a write-up of the research that highlights some of the key trends and most interesting findings. All are available on the Semphonic web site.

It’s obvious from a year-over-year comparison that there has been fairly significant growth in the size and maturity of digital measurement groups. Well, that should be obvious to most of us. What’s more interesting is the clear growth in maturity across a number of different fronts, including Phil’s focus in the write-up: Web governance. But just how important is Web governance – does it allow organizations to do more with less or is it, perhaps, an excuse to do less with more? The survey suggests some interesting answers. You can check out Phil’s write-up along with the survey results and the always intriguing profiles here.

I also wanted to address a few of the questions/comments that I’ve gotten in the last few weeks on prior posts. It’s been incredibly busy here with the year-end rush to get work out the door before the holidays so I haven’t had boatloads of time but there were a couple of comments I wanted to address.

I’ll start with this comment from Jennifer Roberts at Collective Intellect with reference to my post on the futility of Total Mention Counts:

I think I understood where you were going with the whole futility of measuring mentions. Is the value of a NY Times mention vs a blog mention important? I think you definitely have a point but I’m wondering if it isn’t necessarily the mention but what it is that is expressed that’s important. I think the initial analysis that identifies the ‘mention’ is only the starting point; the next step is to extract meaning, and that can mean different values to different companies. Retailers may be more interested in consumers expressing an intention to purchase; entertainment companies may focus their analysis on consumers expressing an intention to view, reducing the total volume numbers to an overall trending value. I don’t know that you can discount the ‘total mention count’ any more than you can ‘total visits’; each provides some context for the more important metric.

On the whole, I don’t think Jennifer and I fundamentally disagree. We both think the most interesting questions are at a lower level than Total Mention Counts. But I’m more deeply skeptical about the value of the Total Mention Count (as I am, to a lesser degree, about Total Visits – we at Semphonic strongly discourage Web reporting around total site metrics). Not every metric is worth reporting on just because it exists.

Let me give an example.

I could add together Entries and Exits on a Website and call it “Doorways”, and I can almost believe that if a tool reported this metric, some analyst would use it. But “Doorways” isn’t interesting – it hides, rather than reveals, the interesting level of reporting. There’s nothing interesting about adding the Entries and Exits on a page. Similarly, I don’t see value in adding mention counts across fundamentally different channels. If your goal is to measure PR, then you need to focus on one kind of mention count. If your goal is to measure Consumer Sentiment, then you need to focus on a different kind of mention count. It’s not clear to me that when you add these two types of samples together you’re counting anything more real than “Doorways.” I’m just not sure what I’m measuring when I use a statistic like “Total Mention Count” and I’m coming to think I’m not measuring anything at all.
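To make the point concrete, here’s a minimal sketch in Python. The page names and numbers are entirely hypothetical, and “Doorways” is, of course, the made-up composite from the paragraph above:

```python
# Hypothetical per-page counts; all names and numbers are invented for illustration.
pages = {
    "landing-page": {"entries": 3000, "exits": 400},
    "checkout":     {"entries": 300,  "exits": 3100},
}

for name, counts in pages.items():
    doorways = counts["entries"] + counts["exits"]  # the made-up composite metric
    print(f"{name}: Doorways={doorways} "
          f"(entries={counts['entries']}, exits={counts['exits']})")

# Both pages report a Doorways total of 3,400, yet one pulls visitors in and
# the other bleeds them out. The sum erases exactly the distinction an analyst
# would act on -- which is the same objection to adding mention counts across
# fundamentally different channels.
```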

I’ve also been thinking a lot more about my post on the changing role of Customer Satisfaction Surveys. In that post, I argued that most site intercept surveys were over-focused on the wrong sort of questions – ones about site artifacts, not customer needs and attitudes. I’m more convinced than ever that this is true, but I’ve also been thinking about some of the sampling problems inherent in online surveys – problems similar to ones that I wrote about in the Social Media space.

I got this comment from Bryan Henn:

Gary, Great piece. It is important for any analyst to remember the age of social platforms, and the diverse landscape for user generated content. One thought I have shared with my team members has related to the idea that noise is often created around the volume of product sold, and the discounts that apply. Coupons, free offers, etc., plague the ProActiv market as you referenced, whereas you are unlikely to type “coupon for Ford 500” into Google and get many results. I read through the WCAI study and “Insights for Practitioners” but it seemed to lack this thought process. As you mentioned though, I think it’s best that practitioners continue to apply tested measurement and optimization strategies to an emerging segment such as social. If we can optimize conversions or attract engaged visitors through social channels, we are doing something right. Thanks.

Bryan’s point is dead-on. One of the most challenging aspects of doing Social Media measurement is separating out the fruits of your own (and competitive) marketing efforts from conversations that reflect un-channeled consumer conversation. This is really hard, but it’s a key part of building a good social sample – regardless of your intent. It’s interesting to measure the impact of your (and your competitors’) marketing. It’s interesting to measure current consumer sentiment. But just as with Total Mention Counts across channel types, it’s not interesting to mash these two things together.
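A rough illustration of what that separation might look like in practice. The keyword list and the mention records below are hypothetical – real campaign detection would draw on your own (and competitors’) campaign calendars, offer codes and tracked links, not a handful of promotional terms – but the shape of the problem is the same: tag the sample before you trend it.

```python
import re

# Hypothetical promotional markers; a real list would come from actual
# campaign artifacts (offer codes, tracked links, known promo copy).
PROMO_PATTERN = re.compile(
    r"\b(coupon|promo code|free trial|% off|giveaway)\b", re.IGNORECASE
)

# Hypothetical mention records as they might arrive from a listening feed.
mentions = [
    {"text": "Loving the new cleanser, skin feels great"},
    {"text": "Use coupon SAVE20 for 20% off your first order!"},
    {"text": "Anyone tried this? Thinking about switching brands"},
    {"text": "Free trial + giveaway this week only, RT to enter"},
]

campaign = [m for m in mentions if PROMO_PATTERN.search(m["text"])]
organic  = [m for m in mentions if not PROMO_PATTERN.search(m["text"])]

print(f"campaign-driven mentions: {len(campaign)}, organic mentions: {len(organic)}")
# Trend the two series separately; a single combined count would let a
# coupon blitz masquerade as a shift in consumer sentiment.
```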

Reflecting on this point, however, I’ve realized that site intercept surveys are quite vulnerable to similar problems. There are some obvious limitations (as well as advantages) in the use of site intercept surveys to measure consumer sentiment, but there are hidden challenges here as well. I’ve been thinking about both and I’m hoping to discuss this in detail in my next post. I’ll foreshadow by saying that these problems place a premium on behavioral integration but also, rather unfortunately, make a mockery of any attempt to use online survey data for benchmarking purposes.

This whole issue of the right focus for online survey research is a deep one with all sorts of interesting offshoots. There are organizational and process issues here as well. David Harrod pointed that out in this comment:

I think the insight that site csat tools can be used for purposes beyond site csat is an essential one, and one you get to within 6 mos. of implementation. The interesting thing about both is that they rely on the same type of engagement between the product teams or business stakeholders and the analytics team.

I think that’s right on both fronts. There’s usually a gap between deploying online survey technology and understanding what it’s really for (though six months may be a “best-case” experience). Part of the reason for this gap is a communications disconnect between the survey owners and the people who could most take advantage of the data – a disconnect that affects many other aspects of Web analytics practice. In other words, one of the reasons why online surveys are so uninteresting is that they aren’t designed, in most cases, by the right people!

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores performed including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young's Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.
