Into the Brave New IT World

Arthur Cole

In the old days (and I mean really old, say, 10 years ago or more) we used antiquated terms to describe IT resources. Words like hardware, software, applications and services, and even vendors, developers and users, all had specific meanings depending on what the item in question did and how it functioned.

Those were simpler times, although those who survived will readily admit that life was difficult.

But at the risk of sounding like my dad (10 miles to school each day, in the freezing snow and blinding rain, with no shoes, uphill, both ways), let me point out that the recent blurring of lines between these once-discrete concepts is not necessarily a bad thing. It may have complicated matters somewhat, but in the end we should see a much more vibrant, dynamic enterprise environment capable of things that were inconceivable to those poor wretches of a decade ago.

Our first piece of evidence in support of this premise comes from IBM. The behemoth (formerly a hardware company, then primarily a software and services provider, and now a cloud enabler) has another next-generation cloud platform ready to go. The SmartCloud portfolio is a mix of hardware, software, services and practices designed to foster the use of public, private and hybrid clouds by breaking down the barriers (economic, technical and otherwise) that have hampered deployment so far. It's basically IBM's way of saying, "Any way you want it, you got it."

At the same time, IBM has launched a new user advocacy group (yes, IBM is a cloud user, too) designed to foster open standards for many of the nettlesome security and interoperability issues that hamper cloud deployments. The Cloud Standards Customer Council, operating under the aegis of the Object Management Group, also counts CA, Rackspace and Software AG as members, along with leading "pure-play" users like Lockheed Martin and Citigroup. Immediate targets include a series of reference architectures for hybrid clouds, data management and compliance.

On the other side of this coin, we have Facebook. As the leading Web presence on the scene today, the company has, not surprisingly, taken a stronger interest in the way its own data ecosystem is structured. To that end, it has adopted a hands-on approach to hardware and software configurations, aiming to squeeze every last bit of efficiency and flexibility from the underlying infrastructure components. That effort has now coalesced into the Open Compute Project, a set of specifications for everything from motherboards to server racks and chassis to full construction designs and practices. The project has already lined up big names like Dell, AMD, Intel and HP to start building OCP components for the masses. The user, then, is now taking an active role in the development of the technology.

As I mentioned, there is certainly nothing wrong with greater efficiency and higher performance, particularly at a time when profit margins are just barely starting to surface again after a two-year absence. But it does mean that the old, orderly way of doing things is gone. In the future, data infrastructure will be whatever and wherever it needs to be at a moment's notice, suppliers will be customers and vice versa, and success will depend not on having the latest and greatest systems, but on how well you manage the interactions between your active resources.
