With each new round of cloud computing platform releases, there is a sense that the IT industry is heading further into uncharted territory -- one which may spell the end for any number of long-time hardware and software categories, and possibly entire industries.
However, fear of the unknown being what it is, I can't help but wonder whether a lot of these concerns are overblown. I can't cite any actual evidence as yet, considering that the cloud phenomenon is still very new, but having tracked technology trends for close to two decades now, I have reached some general conclusions that I think still hold up under the cloud.
One is that new technologies rarely, by themselves, obliterate old ones. Television did not destroy radio or cinema. Airplanes did not wipe out the railroads. Each of the earlier technologies simply had to shift its business plan to compete in the new environment.
Another is that new technologies are never created out of whole cloth. Rather, technology follows an evolutionary path, building on the knowledge of previous generations to establish a seemingly new way of doing things. The cloud is a perfect example of this.
Take IBM's newest cloud endeavor. When you get right down to it, the CloudBurst portfolio consists of things like a 42U rack, a BladeCenter chassis, a management server, several blades, Fibre Channel-attached storage, VMware VirtualCenter, and various Tivoli and Systems Director management stacks. I don't know about you, but that sounds like pretty standard-issue data center fare to me.
According to David Linthicum at Intelligent Enterprise, the CloudBurst portfolio highlights the dilemma that IBM and others face when it comes to the cloud: the goal is to reduce hardware and software costs while still delivering the same data center services. In his view, the notion they've all hit upon -- that enterprises should experiment with private clouds before jumping onto Google or Amazon -- is merely a desperate attempt to keep the train from leaving the station before the vendors can climb into the engineer's chair. Perhaps, but it's also true that whether the cloud is public, private or hybrid, the basic infrastructure will have to reside somewhere, and integrated portfolios like CloudBurst will still have value, provided they can deliver a cost-effective and meaningful solution.
The central question, though, is whether the cloud will eliminate individual data centers in favor of large, regional cloud centers. As Jeff Vance points out on IT Management, clouds are already five to seven times more cost-effective than a traditional data center, a major factor in recessionary times. On the other hand, there is still widespread resistance -- and rightly so -- to entrusting mission-critical data to third parties, so at best external clouds will most likely be used to offload non-critical data and applications.
The very nebulous (hah) nature of the cloud itself makes it difficult to draw any sweeping conclusions on its ultimate role in the IT universe. As Forrester's Frank Gillett points out in this podcast on E-Commerce Times, the cloud is simply an extension of technologies and concepts ranging from virtualization to time-sharing, so in reality we're not dealing with anything that hasn't already been affecting the data center over the past decade or more.
If any change is to come to the data center, whether from clouds or anything else, you can bet your last dollar that it won't come from a technology that is more expensive and less productive than the ones we have now.
The cloud may represent a new way of doing things, but if the IT community -- and that includes the numerous hardware and software vendors -- has proven anything so far, it's that it has the capacity to adapt, and there's no reason to think it can't continue to do so.