I was on Amazon earlier this week and noticed that some people appear to be working overtime to keep readers away from the new book “Haunted Empire: Apple After Steve Jobs.” When the book first came out, I was surprised to see five reviews on Amazon, all indicating that the book was horrid. (Since then, however, reviews have clearly trended more positive.) When I looked at the people who wrote those reviews, though, they seemed to have either never reviewed a book before or never reviewed anything at all, which was really suspicious.
My experience with Amazon book reviews dovetailed with concerns I’d heard about Yelp, where folks have discovered that a large percentage of the reviews come from people who aren’t legitimate. Both are good examples supporting an argument I read in a Forbes article suggesting that enterprises can’t trust software review websites.
The Forbes article resonated with me, because I know that getting IT folks to write tech reviews is really difficult. For one thing, IT staff usually doesn’t have the time; for another, IT skills and writing skills don’t often go hand in hand.
The problem Yelp is desperately trying to correct is that too many of its reviewers fall into one of three categories:
- Company shills, which range from employees to paid contributors who say nice things about the product or service but have never actually used it
- Company enemies, which can include disgruntled employees, employees of competitors or unreasonable customers
- Trolls, or people who just love to go on the Web and cause folks pain because they can
Yelp is trying hard to better ensure the quality of its reviews, but that has proven to be a very difficult task. The sad thing is that the real reviewers often don’t have the writing skills of the fake ones, so the best “read” is frequently from a person who isn’t honest. Even after a significant effort to improve review quality, a recent report indicates that false Yelp reviews have jumped from 5 percent to 20 percent, and this is probably an understatement, because it is very likely that some of the false reviews haven’t been caught.
Enterprise Review Websites
If this is bad news for Yelp, it is horrid for sites trying to provide similar review services for enterprise products. Unlike consumer products and services, where the use cases are generally simple and broadly similar, enterprise deployments are neither. For instance, you can have a great experience with a vendor or product when the vendor puts its best people on the job and partially funds the deployment in order to get a customer blessing or to work out an early adopter problem. In addition, products are deployed very differently based on the unique requirements of each company. Plans depend on what the company needs to have integrated, as well as the training and quality of the team doing the deployment.
In short, even if you could get an honest review, you will find that your experience is very different from the reviews you’ve read unless your environment is nearly identical to the one you read about.
I do a lot of interviews with IT folks, but one stands out. It was years ago, during an event called Scalability Day, which was held to showcase that Microsoft’s Windows NT could scale to mainframe loads. The company that I called raved about the offering; however, under protracted questioning, they admitted that if they, rather than Microsoft, had paid for the deployment, it would have cost them three times what a mainframe would have cost. The product worked and the review was extremely positive and authentic, but the result had no relation to what would happen if another company had tried to emulate what had been done.
What I’ve found over the years is that the success of a product or service often has more to do with the quality of the vendor team, the readiness of the IT shop to accept the technology, the business type, and the experience of the IT team doing the deployment than with the quality of the product or service itself. Yet these points aren’t typically captured in a review at a level granular enough for another company to emulate.
Wrapping Up: Compare Like to Like
The best way to get trustworthy online reviews for IT products would likely be to ensure the independence and knowledge of the reviewers and to group reviews by geography, business type and technology mix. At least then you could get more of an apples-to-apples view, though you might not get enough data points to put your mind at ease. If your enterprise is large enough, you could also engage other folks in the company who use products like the one you are researching. For widely used products and services, this process would provide validated reviews that apply to your industry and business and better reflect what your experience would likely be.
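To make the grouping idea concrete, here is a minimal sketch in Python of how a review site might bucket reviews by deployment context before averaging ratings, so you only compare reviews from environments like your own. The data model (`ReviewContext`, `Review`) and the minimum-sample threshold are my own illustrative assumptions, not anything an actual review site has published.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class ReviewContext:
    """Hypothetical deployment context: the axes suggested above."""
    geography: str
    business_type: str
    tech_stack: frozenset  # frozenset so the context is hashable

@dataclass
class Review:
    product: str
    rating: int  # e.g., 1-5 stars
    context: ReviewContext

def bucket_reviews(reviews):
    """Group ratings by (product, context) so comparisons are apples-to-apples."""
    buckets = defaultdict(list)
    for r in reviews:
        buckets[(r.product, r.context)].append(r.rating)
    return buckets

def comparable_average(reviews, product, context, min_samples=3):
    """Average rating for a product in a matching context, or None when
    there are too few matching data points to be meaningful."""
    ratings = bucket_reviews(reviews).get((product, context), [])
    if len(ratings) < min_samples:
        return None  # not enough like-for-like reviews to trust
    return sum(ratings) / len(ratings)
```

Returning `None` rather than a thin average reflects the caveat above: a grouped view is only useful when the bucket actually contains enough comparable deployments.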
In the end, though, it will be difficult for consumer-oriented online review sites to maintain adequate quality. In fact, it will likely be impossible for a company to provide enterprise-level product reviews online without some kind of analytics tool that ensures quality, accuracy and applicability.