Customer Rating and Review Sites: An Upcoming Crisis of Confidence?



My colleague Dr. David Ensing was quoted in Automotive News yesterday regarding a study he did on customer review sites. This is a hot topic for many automotive manufacturers. The link to the article is below, as is Dave’s overview of the study.


Now more than ever, online rating sites are becoming an important part of the customer experience. When was the last time you searched for a review of a product you were thinking of buying, a hotel where you might be staying, or a restaurant where you might be eating? Should you believe what you read?

Given the prevalence of online reviews, companies have been paying more attention to their own performance and their retailers’ performance at customer rating and review sites such as Yelp, Trip Advisor, Urban Spoon, Google+, Dealer Rater, and a plethora of others. Many companies want to gather information about what customers are saying about their products and retailers at these sites for business analysis purposes. Some want to post reviews from their customer experience programs to websites as part of marketing programs. Others hire social media reputation management firms in an attempt to make sure they are portrayed positively at these sites.

Maritz Research wanted to get a better perspective on how many customers use these sites, what they do there, and how they feel about the information presented there. This is the first of a three-part series about what we learned and focuses on the surprising lack of trust of the information on these sites. These findings are from an online panel study we conducted asking 3,404 people about their use of dedicated customer review and rating sites such as Yelp, Trip Advisor, and others.

One in Four Thought Review Site Information Is Generally Unfair

We asked rating site users to rate the integrity of information at rating sites. Nearly 25% thought the information presented was unfair, with 16% saying it was overly negative and 9% saying it was overly positive. The remaining 75% thought the information presented at rating sites was generally fair. Interestingly, though, even these respondents pointed out that they need to separate trustworthy reviews from untrustworthy ones, usually due to concerns that some reviews are biased or fake. For example:

  • I can usually tell when a review is fake – if it is overly positive. I usually try to read a range of reviews. If there are not enough reviews, I don’t usually take it seriously. I don’t think websites can prevent all companies from posting fake reviews, but by reading the various reviews, I can usually reach an accurate conclusion.
  • I’m not sure how objective it is. I think that people tend to only share either the good or the bad which usually does not give you a real balanced opinion.
  • I think the reviews are for the most part honest, but I believe that the site is selective in the reviews it posts.
  • Some are credible some are biased or questionable
  • Generally reliable if one disregards the top 5% and the bottom 5-10% of comments.
  • Take everything with a little skepticism but if enough people say it then more believable.
  • I believe that a lot of the information is put there by people who work for the companies to show positive information.

Another indication of skepticism is that most rating site users (60%) say they pay more attention to the actual customer comments than to the numerical or star ratings reviewers provide. This compares to 11% who say they pay more attention to the numerical ratings and 29% who pay attention to both equally. From the comments our rating site users made about their impressions of information at rating sites, it appears they pay more attention to the reviews (versus the ratings) because they are trying to determine whether a review is credible and whether it applies to their situation. Here are some examples:

  • Can be useful but often the text makes me think their review is not based on the things I think are important. Sometimes seems too much like an ad (like they were paid) and sometimes is just not the aspects I find relevant. But then sometimes it is good.
  • Sometimes reviews are exaggerated both ways, positively and negatively.
  • You have to read between the lines to see what the reviewer is really saying some times. I discount reviews that I feel are unfair, and give more credit to reviewers that give great detail and reasons for their reviews.
  • Have to try to determine if review is legit or if the writer has an axe to grind, out for revenge.
  • I think the text of the reviews tells you a lot about whether the reviewer is trustworthy

We also found a couple of interesting demographic differences in respondents’ trust of information at rating sites. Men were significantly more skeptical of rating site information, with 73% saying they thought the information was generally a fair representation of customers’ experiences, compared to 78% for women. Younger rating site users were also significantly more skeptical of the information presented at rating sites than older users. Below are the percentages of users by age group who feel rating site information is generally a fair representation of customers’ experiences.

[Figure: Percentage of users by age group who feel rating site information is generally a fair representation of customers’ experiences]

Trust of Information is Generally Low Across Review Sites

We also asked respondents to tell us how much they trust the information at 13 high-profile rating sites, using a 5-point rating scale: “Trust None of It,” “Trust Some of It,” “Trust About Half of It,” “Trust Most of It,” and “Trust All of It.” The percentage of respondents who said they trust most or all of the information at the various sites ranged from 36% to 59%. Larger and more established sites seemed to be perceived as more trustworthy than newer and smaller sites, but even at these sites more than a third of visitors are skeptical about much of the information.

[Figure: Percentage of respondents who trust most or all of the information, by rating site]

*Low sample size due to low number of rating site visitors


What does this mean for marketers and market researchers who want to use review site information? Most users recognize that much of the information needs to be taken “with a grain of salt.” Therefore, for business purposes this information should be used to monitor what consumers see when they find reviews of a company or retailer; in most cases it should not be used as a measure of actual performance.

To address the trust issue, companies and rating sites need to consider new ways to provide consumers with reviews that have been verified to come from actual customers and that have not been filtered for positivity. One way to do this would be to work with companies to post feedback gathered in well-sampled, less manipulated customer experience programs. These reviews could be labeled “certified” because they would be known to come from real customers who had an actual customer experience. Of course, to make the information credible, companies would need to allow the posting of all reviews (with the customer’s permission), not just positive ones.

Republished with author's permission from original post.

Chris Travell
Chris Travell is VP, Strategic Consulting for the Automotive Group of Maritz Research. He is responsible for working with Maritz' Insight Teams to further the understanding and application of the firm's automotive research. He has appeared on numerous television programs and is often quoted in Automotive News, Time, USA Today, Edmunds, Detroit Free Press, The Globe and Mail and various other publications in regard to issues related to the North American automotive industry. He is the principal contributor to The Ride Blog, Maritz Research's automotive blog.

