Building a Customer Intelligence System – a Data Warehouse for Customer Attitudinal Data

In my last post, I described how, in the 2012 presidential campaign, President Obama's data science team fundamentally re-engineered its use of survey and customer attitudinal research to gain a significant competitive advantage.

Two pieces of what they did really stand out as significant.

First, they broke decisively from the traditional opinion research paradigm focused on broad-brush random samples to predict the election outcome. By collecting massive amounts of research and tying it to individual voter behavioral scores, they were able to use survey research to fine-tune predictive models of turnout and choice. This system created a remarkably sensitive research instrument that allowed them to predict close precincts much more accurately than the Romney campaign or media pollsters. In a tight election, this strategic information advantage ended up playing a significant role in everything from debate strategies to media targeting.

Even more important, in my opinion, however, was the way in which the Obama campaign USED this massive, behaviorally integrated opinion research database to test campaign messaging fine-tuned to very small segments of the voting population. They created a tightly coupled system in which the collection and analysis of voter attitudinal data were used to develop, refine and test themes at EVERY level of the campaign, from individual voter outreach to mass media.

Think there’s nothing new about this?

Sure, every campaign uses opinion research data to tune and analyze the effectiveness of campaign themes. But those surveys are high-level, collected from small samples, and not integrated with knowledge of underlying voter behavior. As research tools, they are blunt instruments. By vastly extending the reach of online research and integrating it tightly with behavioral models, the Obama campaign built a system where they could test messaging to micro-segments on a continuous, real-time basis at very little cost (here’s a very fine article describing in more detail what they accomplished).

The implications for enterprise marketing are plain. Senior decision makers and “marketers” in the Obama campaign had daily, near real-time access to detailed information on how their message was playing in every key region of the country and to every potential swing voter. Even better, they had the ability to rapidly test messaging strategies to evaluate their real-world effectiveness by segment.

Do you have anything remotely similar?

Do decision-makers in your organization use opinion research data to make decisions on a daily basis?

Have you ever even looked at one of your enterprise online surveys?

Have you seen how fantastically uninteresting the questions are?

Have you ever tested actual communication strategies with your online research?

Do the folks running your site tests and email campaigns even look at opinion research data?

Enterprise customer research is deficient in almost every respect by comparison. Few people are surveyed. The resulting data is rarely integrated with behavioral data, and even when it is, the sample size is too small to be behaviorally useful. There are no models of decision-making and choice that exploit the combination of behavioral and attitudinal data. The information is rarely disseminated throughout the enterprise, and decision-makers almost never use survey data to test marketing strategies at anything but the broadest consumer level.

Why is that? How is it possible that even a well-run and well-financed but ephemeral operation like a presidential campaign should have better customer attitudinal research than the marketing departments of Fortune 500 companies?

They shouldn’t, and the remedy is simple. You already have a great model for what you should be doing with customer attitudinal data: your Customer Data Warehouse. Think about how the mature enterprise routinely treats customer transactional data:

  1. The data is centralized in a warehouse
  2. Its collection is standardized, and rigorous quality-control procedures ensure its integrity
  3. Data from multiple sources is normalized and placed in a standard model, making it easy to join the data and extract it in interesting business combinations
  4. Powerful reporting tools provide direct access to the data
  5. Regular reports that trend and contextualize the data are distributed to decision-makers to increase its value

This is a terrific blueprint for using customer attitudinal data, and paying attention to it can cure a multitude of problems in your current research program.
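To make the parallel concrete, here is a minimal sketch of what the blueprint might look like applied to attitudinal data: every source mapped into one standard record layout, centralized, and keyed to the same customer identifier used for behavioral data. The column names, sources, and the use of Python/pandas are all my own illustrative assumptions, not a prescription.

```python
import pandas as pd

# Hypothetical standardized schema: every attitudinal source is mapped
# into the same record layout before it lands in the warehouse.
COLUMNS = ["customer_id", "source", "collected_at", "question_key", "answer_value"]

online_survey = pd.DataFrame(
    [["C001", "online_survey", "2013-01-15", "purchase_driver", "price"],
     ["C002", "online_survey", "2013-01-16", "purchase_driver", "service"]],
    columns=COLUMNS,
)

call_center = pd.DataFrame(
    [["C001", "call_center", "2013-01-20", "call_type", "billing_complaint"]],
    columns=COLUMNS,
)

# Centralize: one table, one model, regardless of collection channel.
attitudinal = pd.concat([online_survey, call_center], ignore_index=True)
attitudinal["collected_at"] = pd.to_datetime(attitudinal["collected_at"])

# Join to behavioral data on the shared customer key, exactly as you would
# join fact tables in the transactional warehouse.
behavioral = pd.DataFrame(
    {"customer_id": ["C001", "C002"], "value_segment": ["high", "mid"]}
)
combined = attitudinal.merge(behavioral, on="customer_id", how="left")
print(combined)
```

Once the data sits in a shape like this, the reporting and trending pieces of the blueprint become ordinary warehouse work rather than one-off survey analysis.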

Right now, the vast majority of enterprises don’t consolidate information from ANY of their customer attitudinal research programs. OpinionLab data is one silo. Online survey research is another. Offline surveys are a third. Social media is somewhere over there. Call center data is somewhere else. In very few enterprises is there a single, centralized place where customer attitudinal research comes together.

Which also explains why there is virtually no standardization in that research. We routinely see frustrating inconsistencies among the various components of the customer research program. Online and offline surveys may categorize visitors differently (using different age or income categories), word basic questions differently (like role or relationship), and use different qualification criteria. Many efforts are sporadic, and the results are never tracked over time. These differences make comparison of data across sources difficult or impossible. It’s silly and easily fixed, but until you actually put the data together, no one needs to solve the problem because no one is using the combined data.
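One way to picture the fix: define a single enterprise-standard set of categories and a crosswalk from each source's local coding into it, applied at load time. The age bands, source names, and pandas mechanics below are invented purely for illustration.

```python
import pandas as pd

# Hypothetical crosswalk: each source's local age bands map to one
# enterprise-standard set of bands defined once, in one place.
AGE_CROSSWALK = {
    "online_survey": {"18-24": "18-34", "25-34": "18-34", "35-54": "35-54", "55+": "55+"},
    "offline_survey": {"under 35": "18-34", "35 to 54": "35-54", "over 54": "55+"},
}

def standardize_age(row):
    """Translate a source-specific age label into the standard band."""
    return AGE_CROSSWALK[row["source"]].get(row["age_raw"], "unknown")

responses = pd.DataFrame(
    {"source": ["online_survey", "offline_survey"],
     "age_raw": ["25-34", "over 54"]}
)
responses["age_band"] = responses.apply(standardize_age, axis=1)
print(responses)  # both rows now share the same age_band vocabulary
```

The point isn't the code; it's that the mapping lives in one governed place instead of being re-invented (or ignored) every time a new survey is fielded.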

In a customer data warehouse, architects think hard about how to combine and model the data to make it useful and predictive. There is no similar function when it comes to customer attitudinal research. The result is that this data is almost NEVER used in combination. Have you ever even seen (much less used) a report that shows how customer attitudes about key decision factors are changing alongside evolving call-center call types? Enterprises act as if there’s no relationship between customer attitudes monitored in one place and customer attitudes (or behavior) monitored in another. Clue phone: it’s the same customers.

In the Customer Data Warehouse, your line managers usually have direct access to the data through modern BI tools. They use this transactional data on a daily basis to monitor the health of the business and make decisions. No equivalent system exists in most enterprises for customer attitudinal research. Not only is the data unavailable in any appropriate tool, but the siloed tools around customer attitudes are often extremely poor at data democratization. There’s a good chance that most of your marketing managers have never accessed customer attitudinal data in anything other than a flat report. You just can’t USE data this way.

Finally, most enterprises have worked hard in the past two years to create scorecards that track key performance measures across the business. If you’ve incorporated any aspect of customer attitude research into those scorecards, it’s probably an NPS or site-wide satisfaction score. This is useless. I’ve written elsewhere about how misguided this approach is, and there is a better alternative. Your customers deserve more than a single score supposedly representing your success with them. Do you think the Obama campaign had a single election-wide score? Do you think that’s even meaningful?

What the Obama campaign realized and acted upon was that customer attitudes matter at a very deep level. To build good messaging strategies, they needed to understand key segments of swing voters (whether swing in attitude or turnout). Your marketing problems are almost certainly identical. This makes customer attitudes worthy of an enterprise scorecard unto themselves: one that unifies all the sources of customer attitudinal data in a single, powerful view. A view that segments customers in a consistent, thoughtful fashion; that tracks key attitudinal measures for each of those segments over time; and that constantly pushes the envelope of learning around those segments by continually exploring their evolving drivers of choice.
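Mechanically, a segment-level attitudinal scorecard can be as simple as a pivot: consistent segments on one axis, reporting periods on the other, and a key attitudinal measure in the cells, trended rather than collapsed into one enterprise-wide number. The segment names and the measure below are invented for illustration; assume the unified attitudinal table sketched earlier, already joined to a consistent segmentation.

```python
import pandas as pd

# Hypothetical unified attitudinal data, already joined to a consistent
# customer segmentation (segments and measure are illustrative only).
scores = pd.DataFrame(
    {"segment": ["loyalist", "loyalist", "at_risk", "at_risk", "at_risk"],
     "period": ["2013-01", "2013-02", "2013-01", "2013-02", "2013-02"],
     "likelihood_to_recommend": [9, 9, 4, 5, 6]}
)

# One scorecard view: a key attitudinal measure trended by segment and
# period, instead of a single site-wide satisfaction number.
scorecard = scores.pivot_table(
    index="segment",
    columns="period",
    values="likelihood_to_recommend",
    aggfunc="mean",
)
print(scorecard)
```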

At Semphonic, we call this attitudinal warehouse, built along these five lines, a Customer Intelligence System. Building such a system may be the single most important, valuable, and cost-effective means in your arsenal for making digital measurement matter.

[If you’d like to meet at the Omniture Summit drop me a line – would love to chat!]

Republished with author's permission from original post.

Gary Angel
Gary is the CEO of Digital Mortar. DM is the leading platform for in-store customer journey analytics. It provides near real-time reporting and analysis of how stores performed including full in-store funnel analysis, segmented customer journey analysis, staff evaluation and optimization, and compliance reporting. Prior to founding Digital Mortar, Gary led Ernst & Young's Digital Analytics practice. His previous company, Semphonic, was acquired by EY in 2013.
