Dear [First Name],
A friend of mine who is building one of those "wisdom of the crowd" software review sites tells me her research shows that what buyers want most is "apples-to-apples" comparisons of product features.
Duh.
I'll spare you my rant on why crowd-sourced recommendations are a bad idea (hint: when you're sick, do you go to a doctor or ask a bunch of random strangers which treatments worked for them?). Suffice it to say that Raab Associates' B2B Marketing Automation Vendor Selection Tool (VEST) is written by a professional analyst (me) who has assembled 200 rigorously defined points of comparison on 25 marketing automation systems, allowing buyers to quickly find vendors who meet their needs. At a time when the marketing automation industry is more confusing than ever -- and when 25% of marketing automation buyers are unhappy with their results* -- it's essential to have solid, detailed information to make a sound decision.
That’s not terrible, at least by my pitifully low personal standards. But it did leave me feeling a bit uncomfortable about bashing the crowd-sourced software review sites. The problem was the doctor analogy: although it does express my fundamental objection accurately, it doesn’t tell quite the whole story. It’s true that random strangers can’t accurately diagnose you or prescribe a treatment. But random strangers can indeed provide useful information about whether a doctor is good to deal with and how well their recommendations worked out. Similarly, crowd-sourced sites can provide valid information on how easy it is to use a piece of software and how well the company does at customer support. This is much closer to the kind of information you’d get from consumer review sites like Yelp. Customers don’t need to be technical experts to tell you whether they’re happy.
You’ll note that I haven’t criticized the crowd-sourced sites for the typical review site problems of fake reviews and biased reviewers. Companies like g2crowd (my main point of reference here, although not my friend’s business) do a reasonably good job of controlling for these by requiring users to verify their identity by logging in through LinkedIn. Of course, smart vendors will still game the system by encouraging satisfied users to post reviews. There’s nothing unethical about that, but it means the resulting rankings reflect each company’s marketing skills at least as much as the relative quality of its products.
On the other hand, good crowd-sourcing sites let users see reviews from companies similar to their own in terms of size, industry, etc., and ask specific enough questions to get meaningful answers. And even the general comments they gather are somewhat useful as indicators of what a (highly biased) sample of users thinks.
But, now that I’m on the subject, I’ll tell you what I really think: feature comparisons, whether prepared by a not-so-wise crowd or a professional analyst like Yours Truly, are not really the problem. What stops marketers from choosing the right software isn’t a lack of information about product features. It’s a lack of understanding of which features they need. Figuring that out requires crossing the gap between their business objectives, which most marketers do understand, and the features needed to support those objectives, which most marketers do not. Making that translation is where industry experts really add value, even more than through their familiarity with the details of individual products. I’ve recently been working on a very interesting project to close that gap…but that’s a topic for another day.