Rating the Crowd-Sourced Marketing Software Review Sites



What began as a whimsical “landscape of landscapes” led me to realize that crowd-sourced review sites are the most common type of vendor directory, accounting for 15 of the 23 sources listed in my original graphic. This begged for a deeper look at the review sites to understand how they differ and which, if any, could replace the work of professional reviewers (like me) and software guides (like my VEST report).

The first question was which sites draw a big enough crowd to be useful. I used Alexa traffic rankings, which are far from perfect but good enough for this sort of project. (Compete.com gave similar rankings, except that TrustRadius came in lower, although still in the top 10.) After adding two review sites that I learned about after the original post, I had 17 to consider. In order of their Alexa rankings, they were:

Since crowd wisdom without a crowd can’t be terribly effective, I limited further analysis to the top 10 sites. Of these, AlternativeTo.net, SocialCompare, and Cloudswave were different enough from the standard model that it made sense to exclude them. This left seven sites worth a closer look.

The next question was the sites' coverage of marketing technology. Every site except TrustRadius covered a broad range of business software, from accounting to human resources to supply chain, as well as CRM and marketing. TrustRadius was more focused on customer-related systems, although it still had business intelligence and accounting. The numbers of categories, subcategories, and marketing subcategories all differed widely but didn’t seem terribly significant, apart from SoftwareInsider and DiscoverCloud looking a bit thin. Differences in the numbers of products in the main marketing categories also didn’t seem meaningful – although they do illustrate how many products there are, in case anyone needs reminding.

What did look interesting was the number of ratings and/or reviews for specific products. I sampled leading marketing automation vendors for different sized companies. It turns out that G2Crowd and TrustRadius had consistently huge leads over the others. I didn’t check similar statistics for other software categories, but this is probably the one that counts for most marketers.

Of course, quality matters as well as quantity. In fact, it probably matters more: my primary objection to crowd-sourced software reviews has always been that users’ needs for software are so varied that simple voting based on user satisfaction isn’t a useful indication of how a system will work for any particular buyer. This is different from things like restaurants, hotels, and plumbers, where most buyers want roughly the same thing.

Software review sites address this problem by gathering more detail about both the products and the reviewers. Detailed product information includes separate numeric ratings on topics such as ease of use, value for money, and customer support; detailed ratings on specific features; and open-ended questions about what reviewers liked most and least, how they used the system, and what they’d recommend to others. Reviewer information on all sites except Software Advice starts with verifying that the user is a real person by requiring a LinkedIn log-in. This lets the review site check the reviewer’s name, title, company, and industry, although these are not always fully displayed. Some sites verify that the reviewer actually uses the product. Some provide other background about the reviewer’s activities on the review site and how their work has been rated.

I can’t show how each site handles each of those items without going into excruciating detail. But the following table gives a sense of how much information each site collects. Of course, reviewers don’t necessarily answer all these questions. (Caution: this information is based on a relatively quick scan of each site, so I’ve probably missed some details. If you spot any errors, let me know and I’ll correct them.) When it comes to depth, TrustRadius and DiscoverCloud stand out, although I was also impressed by the feature details and actual pricing information in G2Crowd.

The number and depth of reviews are clearly the most important attributes of review sites. But they also differ in other ways. Selection tools to identify suitable vendors are remarkably varied – in fact, the only filter shared by all sites is users’ company size. Industry is a close second (missing only in DiscoverCloud), while even selections based on ratings are found in just four of the seven sites. Only three sites let users select based on the presence of specific features, an option I believe is extremely important.

Looking beyond selection tools: most sites supplement the reviews with industry reports, buyer guides, comparison grids, and similar information to help users make choices. Several sites also let users pose questions to other members.

So, back to my original question: can crowd-sourced review sites replace professional software reviews? I still don’t think so: the coherent evaluation of a practiced reviewer isn’t available in the brief comments provided by users, even if those comments are accompanied by information about specific product features. This may sound like self-serving mumbo-jumbo, but I do think a professional reviewer can articulate the essence of many products more effectively than users who report only on their personal experience. (Yes, I really just wrote “articulate the essence”.)

But whether sites can replace professional reviewers is really the wrong question. What matters is the value the review sites offer on their own. I’d say that is considerable: given enough volume, they indicate the rough market share of different products, the types of users who buy each system, and what worked well or poorly for different user types. User comments give a sense of what each writer found important and how they reached their judgements. This in turn lets readers assess whether that reviewer’s needs were similar to their own. Buyers still need to understand their own requirements, but that’s something that no type of review can replace.

Republished with author's permission from original post.


  1. Hi David, very timely article as I was just setting out to do similar research!

    Curious, where do you feel Capterra falls within this matrix? They seem to be a hybrid between a reviews site (although not necessarily in-depth), referral site and directory with buyer’s guides.

    I ask because, at least for the software industry I am working with, they show up frequently in the SERPS or in Adwords ads for the keywords that are most relevant to my industry. Obviously, this drives more traffic to their listings – a good thing for those who have them on their site. Software Advice seems to as well. But, I never see anything from G2 Crowd or Trust Radius – leading me to wonder, where does their traffic come from? How do people looking for our software find our listings on their sites?

    Thanks again for the article. It’s been helpful.

  2. Hi Jennifer. Capterra really should have been included in my analysis. In the past I’ve seen them primarily as a directory, but they do have some ratings and reviews. They would not have rated very well in terms of traffic (an Alexa rank of 4,555 puts them at 8th on the list), review volume, or detail provided (only 3 ratings per vendor, only general reviews, and little information on the reviewers). They do allow some filtering based on features in the product.

    I don’t have any particular insight into the traffic sources of the vendors. I’d guess it’s a combination of organic and paid search. As an advertiser, you’d want to look at the quality as well as quantity of the traffic from each site. I’d like to think that the more detailed sites result in better qualified traffic, but the world doesn’t necessarily work that way so I wouldn’t assume that is correct.



