The Evolving Role of Opinion Research


In planning my panel on Cross-Industry Customer Analytics, I had certainly hoped to explore the growing need for and use of online opinion research. It’s not a new subject for me. In my long series this year on the Convergence of Database Marketing and Web Analytics, I discussed several times the critical role that opinion research can play. It’s a central part of Customer Analytics, and it fulfills functions there quite different from those it serves when confined to Web analytics.

Opinion Research can form the bridge between behavioral facts/characteristics and the reasons why they exist.

It can help us target across channels and media where no behavioral patterns exist.

It can provide the necessary information about “drivers of choice and decision-making” so that we can build effective personalization, offer, and creative strategies.

In the traditional marketing world, opinion research served all of these functions, and each is, if anything, even more important in the digital realm. I think every member of my panel would probably agree with this basic list, and there were several excellent examples of traditional research methods being used to drive online marketing.

What I was a little less satisfied with, in retrospect, was our discussion around online survey research. It seems to me that the vast majority of online site research isn’t addressing any of these issues. Instead, the online research we see is focused on assessing site satisfaction and its drivers.

This isn’t all that surprising. Online surveys gained their market traction in a world where companies were struggling to understand how to make their Website better and where the owners of the Website had little or no marketing responsibility beyond the Web. It was a world where Customer Analytics, Personalization, MVT, Targeting and Segmentation were essentially unknown. So why not focus on site satisfaction, site navigation, site images, and site structure? That’s all there was to focus on.

That is no longer our world, and I believe there is an increasing gap between the need for customer research that explores segmentation and drivers of choice and online surveys that explore site satisfaction and its drivers. Almost every interesting survey project we’ve been involved with this year included questions and research that went well beyond (or were entirely unrelated to) the Website. Ironically, this was true even for pure-play internet companies.

It’s not that measuring site satisfaction and its drivers is wrong or bad. It isn’t. It’s just far short of the whole need. By focusing surveys and reporting exclusively in that arena, we’re precluding the use of online surveys for customer research. As our panel did note, you can only survey people so much, and we’re already experiencing widespread “survey burn-out” on the web. You can’t simply tack more questions or more surveys onto your research program. Something has to give.

Almost every survey instrument I see – regardless of company or industry or vendor – is asking the old/wrong set of questions. A set of questions that is, at best, of secondary interest. A set of questions that establishes little or nothing about the customer’s state-of-mind, their product interests, their needs, or attitudes… about anything except the site as a tool or their willingness to recommend the company.

Who cares!

This isn’t critical marketing information. It fulfills not one of the three functions I started this post with. We cannot target folks based on their perception of our navigational structure. We can’t segment based on how they liked the images on the site. We cannot design an MVT program using a Net Promoter Score.

As a Web analyst, I do like getting some of this information. It can be quite useful in deep-dive analysis. No, I don’t need it every week or every month. If you’ve been tracking this stuff, you know already what I’m going to say. It doesn’t change. Hardly. Ever. At. All.

So here’s my suggestion. Sure, run your high-level site satisfaction and visit-type and NPS (if you must) questions all the time. That’s three questions. Run your deep site survey every six months or every year or after a big re-design. That’s plenty for behavioral analysis.

Now use all those open surveys and open questions to explore what your customers really need, what they think, why they decide the way they do, what they want the online channel to accomplish for them, and what you can do about it all. That’s what opinion research is for. It’s harder. It takes more thought and more analysis. But it is worth oh so much more!

So that’s it…I’m closing my short series on the WAA Philly Symposium. I’m confident my posts have given you a sense of how much I enjoyed the day. There’s always so much to learn in an ever-changing discipline like ours. I hope they’ve also deepened the learning even for those who were there. Seen through the lens of our practice here at Semphonic (a practice deeply engaged in every aspect of digital measurement and Customer Analytics), I think the presentations and discussion are even more meaningful. In panels and presentations we get little flashes of illumination. In the aftermath (assisted, in my case, by several LONG plane flights) we have the opportunity to test those illuminations against what we know; to extend them and reflect on the new or changed understandings they create. In relating each topic to work or interests of mine, I’ve tried to do that and to show how much I took away and how thought-provoking the discussion was.

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores performed including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young's Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.
