With Consumer Reports withdrawing its recommendation of Microsoft's Surface laptops and tablets, I want to focus on the need for more timely and accurate research, and on how it can be achieved in consumer and enterprise environments.
In 2015, there was an issue with the Surface line that resulted in painful problems and high return rates. Microsoft was aggressive in adopting a new processor and related technology from Intel, and that aggressiveness hurt the company. Within weeks, though, the related problems were identified and corrected, and today Surface products tend to test toward the top of the stack, according to J.D. Power.
So, while the products did have problems tied to the rollout of Intel's Skylake part in 2015, they appear to test very well now. For the most part, Microsoft has avoided pounding on Consumer Reports and focused on talking about the improvements it has made. Still, sales will clearly take a hit from this, and given that the problems were corrected, that does seem unfair.
Consumer Reports Problem
Consumer Reports has no anti-vendor agenda, but it has only two ways to determine and report on quality: testing or surveys. Microsoft and Tesla both have run afoul of the surveys, which have several issues. One is that the sample is self-selecting and consists largely of Consumer Reports subscribers. Because the methodology randomly samples neither the entire population of product buyers nor even Consumer Reports subscribers, the result isn't representative of either population.
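The distortion that self-selection introduces is easy to see in a toy simulation. The numbers below are hypothetical assumptions for illustration only, not Consumer Reports data: a product with a 10 percent true failure rate, surveyed under the assumption that unhappy owners are five times more likely to respond than happy ones.

```python
import random

random.seed(0)

# Hypothetical population: True means the owner's unit failed.
# Assumed true failure rate: 10% (illustrative, not real survey data).
population = [random.random() < 0.10 for _ in range(100_000)]

def responds(had_failure):
    """Assumed response behavior: unhappy owners respond 50% of the
    time, happy owners only 10% of the time (self-selection)."""
    return random.random() < (0.50 if had_failure else 0.10)

respondents = [failed for failed in population if responds(failed)]

true_rate = sum(population) / len(population)
survey_rate = sum(respondents) / len(respondents)

print(f"true failure rate:     {true_rate:.1%}")
print(f"surveyed failure rate: {survey_rate:.1%}")
```

Under these assumed response rates, the survey reports a failure rate several times the true one (analytically, 0.10×0.50 / (0.10×0.50 + 0.90×0.10) ≈ 36 percent versus the true 10 percent), even though every individual answer is honest. The bias comes entirely from who chooses to answer.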
The bigger issue is that the survey isn't timely: it reports on products that are between one and two years old. Often, by the time the survey is taken, the products have been replaced by newer models, making the results relatively unhelpful. Finding out that a product you bought a year ago was a bad decision does you no good at all, and there is a pretty good chance you already know it.
What you want to know is whether the product you are considering buying is good. If a vendor has a poor track record of fixing products in general, that is useful to know, but even that needs to be timely, because if the vendor is doing a good job today, pounding on it for doing a bad job years ago has no real value to the buyer, or to the Consumer Reports subscriber, either.
With retailers like Amazon having buyers provide real-time feedback on products, the Consumer Reports survey results are also potentially redundant.
Improving Vendor Quality
While it may be fun to slap big vendors around for problems they've had, if the vendor sees the criticism as unfair, you don't improve quality; you just make the vendor mad. That isn't moving the ball forward. Consumer Reports should be, and certainly tries to be, about making buyers more informed and driving better quality, but here its approach has become largely ineffective.
Unless Consumer Reports can make its surveys more timely, the results are useless and could even be counterproductive. It might be better to partner with a retailer like Amazon to capture more current data while improving Amazon's own reporting quality. Or it could drop fast-changing products from the survey and focus on more enduring subjects, such as services, support, and quality trends, pointing out declines in quality over time using more current sources.
Wrapping Up: Improving Analytics, Reporting Practices
Quality is important, and so is the quality of quality reporting: those who report on it need to ensure sound sampling methodology and timeliness. Timeliness is the biggest problem, because stale results are nearly useless to buyers and punish vendors after they've fixed problems rather than motivating them to fix problems.
I don't think Consumer Reports is alone, either. As the industry has moved to analytics, other reporting firms like Gartner and IDG have failed to make that jump, and their reports are increasingly out of date and inaccurate. It is likely time for a lot of folks to look at how they capture data and to update their processes and technology to represent the world as it now exists, rather than the world that existed when the study was first done. A lot has changed in the last 30 years, particularly in how you capture, interpret and report data. As analysts, we should be embracing analytics as part of our process, not just talking about it.
Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm. With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+