X Change Redux

Given my once-a-week pacing, I could probably spend 3 or 4 months debriefing on X Change. But that might be a little too much “looking back”. So rather than bang out a long series of weekly posts, I’ve decided to try to create a single summary of some of my interesting takeaways and notes. Sadly, I have no notes from the excellent discussions of the two Huddles I ended up leading. I’m just not that good at multi-tasking! If anyone out there did a better job, feel free to drop me a line with your thoughts (fully-baked or just notes – I’ll take either and give full credit)…

Mobile Measurement Huddle

By far the most common question in mobile measurement is “Can we track people across devices?” The answer, of course, is often “No” – which leads to much disappointment. But is this really the most important question? How many of us have experienced truly unusable mobile Apps? The state of the art in mobile App GUIs still seems to be in flux, and there are a lot of opportunities for optimization. Clicktale, one of the leading tools for UI optimization, now has a mobile version available, and the ability to use CEM-type tools in mobile makes good sense. There are still tremendous opportunities for GUI measurement even with traditional Web analytics SDKs. Measuring the GUI, though, requires much more granular data collection than is traditionally done and a fundamentally different kind of reporting. It also requires breaking the page-based measurement paradigm to measure App GUI functionality well. My previous blog post had more thoughts on the implications of this for a broader measurement framework.

I think the big takeaway from this discussion is that mobile Apps need better GUI measurement, and that it’s much easier to deliver than solutions to some of the intractable problems around customer journey. When you’re wiring up Apps for measurement, thinking about GUI optimization will lead to a very different type of implementation (Units of Work, Foreground v. Background, Scrolling and Pinching, etc.) than if you’re focused on more traditional measurement.
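To make that concrete, here’s a minimal sketch of what event-level (rather than page-level) GUI capture might look like in a mobile Web view. The /collect endpoint and event names are illustrative assumptions of mine, not any particular vendor’s SDK; a native App would hook the analogous lifecycle and gesture APIs.

```typescript
// Hypothetical GUI-level event capture for a mobile web view.
// The /collect endpoint and event names are illustrative, not from
// any specific analytics SDK.
type GuiEvent = {
  name: string;                      // e.g. "background", "scroll", "pinch"
  detail?: Record<string, unknown>;  // event-specific payload
  ts: number;                        // client timestamp
};

function track(event: GuiEvent): void {
  // sendBeacon survives backgrounding/unload better than XHR
  navigator.sendBeacon("/collect", JSON.stringify(event));
}

// Foreground vs. background: a unit-of-work boundary, not a pageview
document.addEventListener("visibilitychange", () => {
  track({ name: document.hidden ? "background" : "foreground", ts: Date.now() });
});

// Scroll depth, accumulated locally and sent once rather than on every tick
let maxDepthPct = 0;
window.addEventListener("scroll", () => {
  const scrollable = document.body.scrollHeight - window.innerHeight;
  if (scrollable <= 0) return;
  maxDepthPct = Math.max(
    maxDepthPct,
    Math.round((window.scrollY / scrollable) * 100)
  );
}, { passive: true });
window.addEventListener("pagehide", () => {
  track({ name: "scroll", detail: { maxDepthPct }, ts: Date.now() });
});

// Pinch: two simultaneous touch points is a reasonable proxy
window.addEventListener("touchstart", (e: TouchEvent) => {
  if (e.touches.length === 2) track({ name: "pinch", ts: Date.now() });
}, { passive: true });
```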

Not that every problem around customer journey IS intractable – and mobile makes some things quite a bit easier. Web to Call is a big gap in most traditional measurement. You have to be very aggressive with dynamic 800 numbers to have any chance of understanding the flow between Web and call – and except for logged-in sites it’s almost impossible to accomplish at the individual level. In mobile, however, you can capture the click on the phone number – giving you far better insight into the individual journey from App or mobile Web to Call Center.
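Capturing that click is straightforward on the mobile Web side. Here’s a minimal sketch, again assuming a hypothetical /collect beacon endpoint:

```typescript
// Hypothetical web-to-call capture: record clicks on tel: links
// before the dialer takes over. The /collect endpoint is illustrative.
document.addEventListener("click", (e: MouseEvent) => {
  const target = e.target as Element | null;
  const link = target?.closest<HTMLAnchorElement>('a[href^="tel:"]');
  if (!link) return;
  // sendBeacon fires reliably even as the page loses focus to the dialer
  navigator.sendBeacon(
    "/collect",
    JSON.stringify({
      event: "click_to_call",
      number: link.href.replace("tel:", ""),
      page: location.pathname,
      ts: Date.now(),
    })
  );
});
```

Tie that event to your dynamic number pool on the call-center side and you can stitch the individual journey together far more reliably than on desktop.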

Data Democratization Huddle

Democratizing data is a critical aspect of enterprise digital analytics. To be used, data has to be available, understood, and almost ubiquitous. There’s a lot involved in making that happen, on both the process side and the tool side.

Depending on the maturity of your organization, you may need to focus on data awareness – getting people interested in and excited about the data. Our group had a bunch of different techniques for getting this done. A couple that I thought were particularly interesting (and a bit different) were using A/B testing to drive curiosity and creating competition between groups driven by reporting. The relationship between A/B testing and measurement is an interesting one. I’ve long believed that the two are far too poorly integrated. Measurement ought to be the foundation of any good testing program, and techniques like our Use-Case and Two-Tiered Segmentation analysis provide rich measurement frameworks within which to drive intelligent testing. That being said, there are many problems that are more amenable to testing than to analysis, and the outcome of almost any good analysis is likely to be an A/B test. So the relationship between the two is very tight. What makes A/B testing interesting as a wedge into measurement is the ease with which test results can be absorbed and disseminated. Particularly if you wrap a good measurement language system (like Functionalism or Use-Cases) around the testing results, you’ve got a great method for introducing real measurement thinking into the organization in a relatively painless guise.

I also very much liked the idea, from one of the leading UK retailers, of creating competition between groups. A fair amount of their measurement and analytics adoption has piggybacked on a healthy competition between the men’s and women’s lines. Where natural competitive fault lines exist in the organization, finding ways to get one group to use measurement successfully will lead to very rapid and widespread adoption. You can spend your time moaning about how dysfunctional your organization is, or you can find ways to make it work for you.

Our group also had a pretty deep discussion of the role of the analyst in implementation and the role of the Web analytics tool in reporting. Both topics are too broad to cover easily in a short summary, but I thought the general consensus in each case was interesting and right (if controversial).

For the most part, our group seemed to favor keeping the analytics implementation separate from analysts. Analysts tend to be slipshod at governance and MDM – leading to scattershot implementations and generally poor data quality. I have mixed feelings about this. The general idea seems right, but, as I’ve pointed out before, there are strong reasons why significant portions of the implementation design need to be in the hands of an analyst, not an implementer. This isn’t necessarily contradictory – just evidence that you need to be careful where you draw the line between the functions.

There was also a fairly strong sense in our group that democratization is best done outside the Web analytics tool. I’m a firm advocate of this. In general, I want data democratized within a centralized framework. I often say that there should be no “Democratization without prior Centralization.” Perhaps it’s my fundamentally conservative leanings coming through, but so much of the tool language of Web analytics is misleading or slipshod that I’m naturally inclined to believe that good measurement requires more preparation and work. I don’t want users learning about digital analytics from SiteCatalyst or GA. Our current reporting work – whether it’s delivered in embedded models or as segmentation systems – is WAY beyond the type of reporting you can deliver inside the tool. If, like me, you think these techniques are much better than traditional KPI/metric-based reporting, then you’re pretty much committed to democratization outside the standard toolset.

Tools of the CRO Professional

This Huddle was a kind of “Mr. Toad’s Wild Ride” through a formidable array of different tools in our ecosystem. In preparation for the Huddle, Craig Sullivan had put together a fantastic cheat sheet of tools. We covered session replay, email testing, mobile and device testing, browser testing, voice of customer, survey tools, competitive monitoring, social media monitoring, form analytics, microdata, guerrilla testing, performance testing, customer feedback, split testing, SEO tools, and even cloud workflow tools. Some of this, inevitably, was old hat for me. There are types of tool here that I know quite a lot about, and in such a rapid survey I couldn’t hope to hear much that was new to me about social media, customer feedback, or survey tools.

That being said, there were some great learnings. One of our perennial pain points is mobile device testing. When we’re doing implementations, we need to get devices. We all have our own iPhone or Galaxy, of course, but that isn’t the whole of the mobile universe. Over the past few years, we’ve accumulated a bit of a junkyard collection of hardware we’ve needed to test on. It turns out there are some great services that will get you a fully configured hardware environment to test on for very little cost. Services like perfectomobile.com, deviceanywhere.com, mobilexweb.com, and opendevicelab.com were brought forward and discussed. Handy stuff.

It’s a similar situation with browser testing. As Craig pointed out, it’s an everyday occurrence to see key Web pages broken in one or another variant of a browser. We all pretty much have IE, Chrome, and Firefox on our machines. Most of us also have Safari. But what if you need to test an older version? Here are some resources that will facilitate browser testing without mucking up your machine: crossbrowsertesting.com, cloudtesting.com, multibrowserviewer.com, and saucelabs.com.

Device and browser testing are real nuts-and-bolts analytics tools, but they’re often indispensable.

On the other end of the spectrum from these nitty-gritty debugging tools were some of the tools we discussed for prototyping and productivity. These aren’t analytics or measurement tools, but they are clearly part of our ecosystem, and most were new to me.

Prototyping tools we discussed included pidoco.com, verifyapp.com, fivesecondtest.com, conceptshare.com, and usabilla.com.

On the collaboration front, some of the tools are ones many of us have probably used: Google Docs, join.me, and Basecamp are all pretty common. But Conceptshare (workflow), protonotes.com (HTML collaboration), lucidchart.com (flowcharting), and trello.com (project organization) were all new to me, and people in our Huddle thought they were quite handy.

I’m unusually well positioned to see a huge variety of tools in the digital ecosystem, and I still got a lot out of this discussion. That’s the beauty of X Change. If you bring a bunch of smart people together and you listen closely, you’re bound to learn something!

Don’t miss out on all the great learnings. It’s time to sign up for X Change right here in the United States. Come September, the conversation will be hotter than the fire-pits at the lovely Ritz Carlton in Laguna Niguel (LA/Orange County)! See you there.

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores perform, including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young’s Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.
