I hate being only mostly right!
Before we dive down into what we mean by analytics driving testing and how that might/should work, I wanted to talk a little bit more about pure-creative testing. I saw a presentation recently by one of the 2012 Obama team and she talked extensively about the email testing they did. They had four full-time writers cranking out emails. This was just on the donor side of the house. Four full-time writers! God knows how many people there were generating creative for the whole campaign. Naturally, the writers had different styles and often took very different approaches to the same material. One might produce a folksy appeal from the President. Another might choose to produce a letter from Michelle (who was very popular among donors apparently). A third might write a rousing “Republicans coming over the hills with their red necks and shotguns” missive.
Now I’m sure it’s possible that analytics could drive some of those choices, but it hardly seems central. You talked a bit about the role of the creative brief – and I think that’s important and largely ignored in most enterprise analytics/testing groups. Profiling customers from a creative perspective requires a very different kind of analytics than most digital teams are used to building. At the very least, it requires the integration of VoC and behavioral data, and the VoC data can’t be the crummy site-navigation stuff that most people collect.
How can you best use analytics to help writers and designers? What makes a great creative brief?
No matter how good your creative brief is, though, I’m not sure it will do anything other than inspire even more possible variations of creative approach (though one hopes the variations will be more effective because they’re more relevant). It’s really hard to decide, using analytics, whether a folksy or a fiery approach is going to be more effective. Or an image of Michelle versus an image of the President. After all, while it’s perfectly possible to measure how popular Michelle Obama is with donors, I’m assuming the President is pretty popular with them too!
So while I think there is undeniably a role for analytics in pure-creative testing, and an under-served one at that, I also tend to think that you could do a lot of this kind of testing, and get real benefit from it, with very little or no pre-test analytics.
The reason I want to hammer this home is partly to underscore how important I think writing is to testing and optimization. In the email channel particularly, but even on the Web, it seems to me that the vast majority of tests must ultimately be about what to say. Yet companies often tell me that the biggest problem they have when it comes to testing is producing creative. I think that’s a really stupid problem to have. How expensive are writers? You could probably pay a hundred part-time grad students to piecework content for what a single data scientist would cost you. I hear that people make a good living ghostwriting term papers for students. I’m sure those people would be just as happy writing emails for Fortune 500 companies. Am I wrong?
What’s the best solution you’ve seen to producing creative content? At scale? And should producing content really be a limiting factor on testing?