Market, Not Analysts, Drives Fairness in Open Source Research


A recent silicon.com posting says "Analysts Fail Open Source."


That reminds me that next up in my queue here at IT Business Edge is a review, promised in July 2008, of how IT managers and staff can best use IT research. One thing that article will say is to watch out for analysis with an agenda. As Joe Friday put it on Dragnet, it should be "Just the facts, ma'am."


If the recent posting by Martin Brampton had said "Analysts fail IT decision makers with too much feature/function buzzword babble about software," I would agree with him. There should be more holistic case studies that look at the total software life cycle. But the failure of analysts to adopt that research approach is not an open source issue; it holds across their analysis of the entire software market.


Take Brampton's headline word for word and you see the problem. (It is designed to get you to read the piece, of course, so I can't fault him completely.) When he makes a claim like "Analysts Fail Open Source," he should name names; that is, define "analysts." I assume he means firms and not individuals. But not all so-called analyst firms are created equal. Classic Gartner (not Dataquest) is basically a user-advice house. Classic IDC (not Insights) is basically a statistics-based market research house. Name another firm and I'll explain how it differs from the first two.


In any case, all the reputable firms have been covering open source for decades.


So, second, Brampton should define "fail." Perhaps the few reports he reads from some analyst firms just don't say the things about some open source projects that Brampton would like them to say. More likely, he has fallen into the trap of reading what some fringe bloggers say about what some analyst firms say about some open source projects. Clearly, he could not have read, or even seen, the thousands of pages of analysis about hundreds of open source projects written by the analyst firms.


Third, he needs to reconsider his use of the term "open source" to understand how analysts look at it. Open source is not a type of product; it is a set of license terms and conditions, and a culture. Analysts at reputable firms look at specific types of products (and services), and only when relevant do they look at how the software is written and what its license says. No individual analyst could research the breadth of all open source products and provide quality insight. Instead, the operating system analyst looks equally at Linux and Windows, the middleware analyst looks equally at JBoss and WebSphere, the CRM analyst looks equally at Sugar and Salesforce.com, and so forth.


Most disturbingly, like the Magic Quadrant complainants, Brampton charges:

"The general reliance on vendor cash clearly leaves most open source software out in the cold."

But it is the lack of market acceptance of those projects that pushes them to the bottom of the analysts' inbox, not their Ts&Cs or developer communities -- or even the invective of the fringe blogosphere. Lack of market acceptance drives hundreds of non-open-source software products underground as well. For example, I am currently preparing an article on the 100 fastest-growing software companies in the U.S. None are household names among analysts, and none appear to use open source Ts&Cs (pending completion of my research). Interestingly, despite apparently not having "paid off the analysts," they are still growing by leaps and bounds.


He also says:

"But open source projects rarely have significant funding to afford commissioning analysts."

I can only speak for the U.S., but VCs are dropping tens of millions of dollars on companies that use open source Ts&Cs, and those VCs insist on the same "commissioning of analysts" and other PR activity that they require of any software company they back.


By the way, if you have any comments about how you deal with IT research firms that I can use in my upcoming article, send them along. I'm especially interested in dirt, of course; tell me if your research firm did you wrong and I'll dig into the issue.