The Fight to Eliminate the Scourge of Fake Online Reviews

Don Tennant

Don’t you wish somebody would come up with a really effective way to ensure that the reviews you read online are legitimate, and they’re not being posted by the business being reviewed, an unscrupulous competitor, or a random 15-year-old kid with way too much time on his hands? Well, it appears that somebody has.

Mindshare Technologies, a Salt Lake City company that’s a relatively high-profile player in the relatively low-profile world of voice of the customer (VoC) technology and services, last week launched OpenTell, an online platform that serves as a trusted third party to certify customer reviews. The idea behind it is fairly simple: If you want to read a few online reviews of the experiences of customers at the local outlet of, say, a particular restaurant chain before you head over for lunch, it would be nice to know that those reviews weren’t actually posted by the assistant manager, or by somebody from the competing chain across the street.

If the chain is a Mindshare client that has signed up for the OpenTell service, only those people who have actually been to the place and purchased something can post a review—there’s a one-time-use code on the receipt that the person needs in order to post it. I spoke about all of this with Mindshare CEO John Sperry earlier this week, and he explained his strategy:
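Mindshare hasn’t published implementation details, but the mechanism described above amounts to a one-time-use token: a code is generated at purchase time, printed on the receipt, and invalidated the moment a review is posted with it. A minimal sketch of how such a system might work (all names here are hypothetical, not Mindshare’s actual API):

```python
import secrets


class ReviewCodeRegistry:
    """Hypothetical registry of one-time review codes printed on receipts."""

    def __init__(self):
        # code -> {"location_id": ..., "used": bool}
        self._codes = {}

    def issue_code(self, location_id):
        """Generate an unguessable code at purchase time, to be printed on the receipt."""
        code = secrets.token_urlsafe(8)
        self._codes[code] = {"location_id": location_id, "used": False}
        return code

    def redeem(self, code, location_id):
        """Permit a review only if the code exists, matches the location, and is unused."""
        entry = self._codes.get(code)
        if entry is None or entry["used"] or entry["location_id"] != location_id:
            return False
        entry["used"] = True  # one-time use: the same code can never post again
        return True


registry = ReviewCodeRegistry()
code = registry.issue_code("store-42")
print(registry.redeem(code, "store-42"))  # True: first redemption succeeds
print(registry.redeem(code, "store-42"))  # False: the code is already spent
```

The key property is that the code is tied to a real transaction and consumed on first use, so neither the assistant manager nor the rival across the street can post without having bought something.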

We’ve been in the business of collecting customer review data for 11 years, and delivering it in real time to store managers. Right now, we have over 300 million customer reviews in our database, and we’ll collect 85 million customer reviews this year alone. We had the data, but it was private—we had to make it transparently seen throughout the world. We approached our clients and said, and this was a bold statement to them, “What we want you to do is we want you to go through and make public all of the reviews. We want to publish them for you.” As a trusted third party, we certify that each review comes from someone who is already a customer.
Sperry cited the case of Amazon to explain the trend that’s under way:

Amazon two years ago flushed all of their reviews except for their book reviews, which were the only ones they could certify were from legitimate customers. They went to this ecosystem where after you make a purchase, they send a link out to you for you to publish a review, and only you can publish that review. That’s exactly what we do, except we do 350 brands. So we do for Hertz what Amazon does for itself. Expedia just did this about 18 months ago, as well. So the trend is there—people are tired of fraudulent reviews and bad data.

With the launch of OpenTell, Mindshare has, as you might have already surmised, become a competitor of Yelp. So Sperry couldn’t resist doing a little Yelp-bashing:

We’ve been prototyping [OpenTell] and had it in beta for about a year now, and we have about 20 brands on it, representing probably around 20,000 [locations]. Our goal is to have 40 brands on here by the end of Q1, representing as many as 100,000 [locations], since some of our larger brands have a large number of [locations]. This is a change in the ecosystem, in the way things have been done, because right now, anyone can post a review on these other review sites—you don’t even know if they’re a real customer. Yelp is the first one to admit that 20 percent to 25 percent of their reviews are fraudulent. The funny thing is, that was a study [by Harvard Business School] done externally from Yelp, and that’s 20 to 25 percent of the reviews that went through the filter.

I asked Yelp senior PR manager Kristen Whisenand about that in an email. She responded that a clarification is in order:

Our recommendation software goes through the more than 47 million reviews that have been submitted to Yelp to select the most useful and reliable ones. Our stance is quality over quantity and we only recommend about 75% of the reviews that are submitted. This means about 25% of the reviews *submitted* to Yelp are not published on a business’s listing or recommended to consumers.

There are a number of reasons why a review might not be recommended: the review might have been posted by a less established user, or it may seem like an unhelpful rant or rave. Some of these reviews are fakes (like the ones we see originating from the same computer) and some suggest a bias (like the ones written by a friend of the business owner), but many are real reviews from real customers who we just don’t know much about and therefore can’t recommend.

I know there was some confusion stemming from a recent Harvard Business School study, but it's important to note that Yelp's recommendation software had already identified the reviews that the Harvard study found likely to be fakes submitted by businesses themselves.

In any case, it’s clear that supplanting Yelp as a go-to resource for reviews won’t be easy. Sperry said the trick will be to get the search engines on board:

What we need to have is acknowledgement among the search engines that we have the best data—that we have certified customer data. Certified customer data is distinctively more valuable than just anybody posting data. That’s the argument we need to win, that’s what we’re trying to get out there right now. I can tell you, the writing is on the wall when it comes to open sites that just let anybody publish—it’s going to change over to where more people are boutique- or vertical-specific, and that’s where the search engines are going to start pulling from. In the end, good data wins.
Nov 24, 2013 11:30 AM Mark Sadler says:
The 'trouble' caused by open review sites such as TripAdvisor and Yelp is well documented; I have even read a white paper by the Office of Fair Trading in the UK which highlights the negative points. Recently we had the case in New York where 19 companies that sell fake reviews were caught out and fined $350K for their deception. Fake, filtered or gamed reviews do upset a market and do cause harm to customers. It is wrong, and I'm surprised that Google has allowed it to go on for so long. The SEO benefit of reviews has dropped off, but is still significant. I believe that spinning fake reviews is the same as spinning articles: it's black hat! In my niche we have a very small number of sites like ours that have a serious moderation policy and allow both positive and negative reviews to be published, but mostly the other websites only post the positive and gain some twisted credibility in the process. We have partnered with Reevoo, who are a known trust mark for reviews, and will bring them into social care alongside our own reviews. Review sites are at a pivotal point and, as you suggest, it's just a matter of time. I'm waiting for the next Google update.
Nov 24, 2013 7:12 PM Stylish says:
These legitimate comments have been deemed illegitimate by a legitimizing agency's legitimize-BOT.
Nov 25, 2013 1:33 AM Frank says:
"There's a one-time-use code on the receipt that the person needs in order to post." That code now has VALUE. Dishonest customers can sell it to someone else who has an agenda to post.
Nov 25, 2013 7:46 AM Stylish says, in response to Frank:
Great observation, Frank. Do you want to collaborate on a site to establish a marketplace for those codes?
