Liars, Damn Liars and Statistics: Gartner Goofs on Server Numbers


Gartner just made a huge mistake with its server market share numbers, retracting its estimate that HP led the server market in revenue for Q1 2008. HP and IBM are in a dead heat on server revenue, and unless a big gap opens between them, the two companies clearly care more about who is in first than you or I do.


Last month, IDC and Gartner disagreed as to who was in first, while both placed the gap at well under 1 percent. This kind of thing is important to investors, and Gartner's announcement that HP had moved into first place undoubtedly helped HP's valuation and hurt IBM's, making Gartner's retraction particularly interesting.


However, given how deals are accounted for, the limited accuracy of these numbers even inside the corporations themselves suggests that getting within 5 percent of actual sales would be very difficult, let alone having a high level of confidence that a gap of under 1 percent signifies real market leadership.


In this case, HP and IBM are in a dead heat. Neither Gartner nor IDC should be claiming either is in the lead. Currently, I think IDC does a better job of ensuring its numbers are accurate. Gartner's strengths lie in other areas.


Let's talk about numbers.


Why I Don't Do Numbers


I started out working for Dataquest, which later was bought by Gartner and became the source for its numbers. As an analyst working for IBM, I was constantly frustrated with the numbers I received from firms like Dataquest because they weren't particularly accurate, they weren't particularly helpful, and they came with very little analysis (the "so what" part was often missing).


Granted, that last gap meant I could add a lot of value, but when I joined Dataquest, I figured I'd fix this problem and make the numbers mean something. I'd done much of my graduate work on market studies and thought that skill with numbers would be incredibly helpful.


It was almost suicidal.


I created the most accurate actuals and five-year forecast that had, to my knowledge, ever been released. Almost every large client Dataquest had wanted me fired, because vendors pay for these numbers and don't want to pay anyone who would even suggest that their numbers might decline. Granted, I became famous overnight and the news services loved me, but they didn't pay the bills.


The vendors are usually the source of the numbers, which can create some interesting problems. While at a tech company, I knew the guy who supplied the numbers to firms like Dataquest, and he got a kick out of messing with them. He didn't uplift the numbers by 1 percent or even 10 percent -- he often reported numbers that were 150 percent to 200 percent over actual, making our unit seem vastly more successful than it was. I have to believe he wasn't the only one.


In that same unit, I was charged with correcting our own internal reporting, which had historically run about 20 percent underreported. Even if we had wanted to, how could we report accurate numbers externally when our internal numbers were inaccurate?


(By the way, funny story here: I fixed the inaccuracy, only to have a controller uplift the now-accurate number by 20 percent, causing us to overreport and costing the CFO his job. The controller got a promotion. Politics in any big company can get ugly.)


Finally, I used to be close to Apple. We parted ways when I told them they needed to pay a firm like IDC or Gartner to cover them as a server company; otherwise they wouldn't be taken seriously. They then told CNET I was blackmailing them, until they realized I didn't work for IDC or Gartner -- but they never apologized.


This practice of tying a firm's revenue to how favorably it gets reported cuts too close to bad ethics for me to feel comfortable with it. I remain concerned that a mistake like this latest one could trigger an SEC investigation, with dire consequences for everyone involved.


Net-net, there is no real incentive to do truly accurate numbers. In fact, I would argue the opposite is true, particularly with forecasts, and that creates huge issues if you rely on them.


Take Numbers with a Big Grain of Salt


Because of my experience, I don't trust vendor-sourced numbers. In fact, I don't trust numbers in general unless I can validate the source and assure myself that it is both free of bias and reasonably accurate. Anything sourced from those being measured, drawn from a self-selecting sample, or produced with no visible effort to ascertain accuracy is crap. Product decisions should be based on things you can measure, like cost, performance, reliability and value.


If you are a vendor and you aren't paying the firm doing the numbers, you have no clout, so don't be surprised if you don't like what you see. My advice remains the same as what I gave Apple: if market share is important to you, invest in the firms doing the reporting; otherwise you likely won't be happy with the results of the report.


When reading numbers, assume they are for entertainment only unless you can personally assure their accuracy.