A/B and user testing provide an incomplete view of what drives a customer to action
The role of marketing has never been more important to businesses – and marketers have never been more stressed and resource challenged. Website, content, social media, public relations and product marketing all fall under the marketing umbrella, making it the biggest lead generator in the company.
In fact, budget for lead acquisition is often the biggest “discretionary” budget in a company. Yet, according to HubSpot’s State of Inbound report, more than 60 percent of marketers say generating traffic and leads is the biggest challenge facing their organization – and 40 percent of marketers feel the heat to justify ROI to the rest of the organization.
The cruel truth is that these data points are connected: leadership teams recognize the enormous opportunity they have in enabling marketing teams to drive leads through all of the different touchpoints they have with customers and prospects. And as those expectations rise, marketers at once relish the role they play in impacting the business and feel the stress of having to deliver constantly.
Unfortunately, despite the changing expectations, marketers are still using the same playbook: launch a campaign after doing A/B and/or user testing, watch the results, be disheartened by them – or at best, at a loss as to how to make things better – and re-launch.
Digital marketers routinely test new pages that fail miserably. No one fails more than weathermen, digital marketers, and hitters in baseball – but it’s not their fault. They have been given an outdated toolkit and, with the pace of business, it’s been tough to adjust and evaluate new ways to do things. But what is the cost in time and dollars for building, approving, and then running A/B tests on a design that shouldn’t see the light of day?
I believe A/B testing and user testing are incredibly valuable – when used for the right purpose. We had one client skip A/B testing altogether. Want to guess how that worked out? I’ll tell you: engagement dropped by 30 percent. But even with testing, marketers are missing the opportunity to actually understand what drives the visitor and how they perceive the page. Too often, testing is done only while looking into the rearview mirror, with no understanding of why.
User testing as a standalone mechanism for understanding the efficacy of a campaign is not reliable either. One SMB digital team we work with learned from user testing that visitors wanted to see more details of their service; however, when they made that change to their site, the most important message – ironically, how easy their service is to learn and use – got lost. Visitors left believing the service to be feature rich (a win for user testing) but too complex (more irony).
Ultimately, user testing is great for identifying roadblocks and visual challenges, but it can’t provide enough statistical data to help digital marketers understand the emotional reasons that visitors don’t convert.
There is a cavern the size of the Grand Canyon between user testing and A/B testing in the digital marketer’s toolkit – but there’s an answer.
Pre-live testing – testing a campaign or a web page before it is launched in order to optimize it – enables marketers to collect extensive customer feedback on a piece of marketing content before it goes live. It is the perfect complement to other testing methods because, while those are effective post-launch, pre-live testing allows marketers to iterate before putting something out. From day one, marketers are able to realize ROI because the content – campaign, web page, ad, email – has already been tested and validated, and they know why visitors are or are not taking action.
I’ll explore this more in future posts but I’ll leave you with this: imagine a world where marketers can get detailed feedback from their target audience before launching a marketing asset. Who wouldn’t sign up for that?