Time to Come Clean in the Age of the Cloud

Michael Vizard

One of the more interesting and troubling aspects of cloud computing is that it essentially forces IT organizations to start all over again. Business executives, regardless of how much they actually know about cloud computing, are using the term as a club to drive IT costs down. They may not have a clue about what's happening inside the IT department, but they instinctively know that the organization is not getting the biggest possible bang for the IT dollar. They are also certain that, as a percentage of operating expenses, the IT budget is larger than it should be.


Of course, the dirty little secret of IT is that there are a lot of IT executives who don't know exactly what's happening inside their IT departments either. What they do know is that they need to figure that out quickly because it's hard to make a case for building a private cloud unless you can clearly identify how much is currently being spent on what.


To help IT organizations deal with that particular issue, Hewlett-Packard has rolled out a new Discovery and Dependency Mapping (DDM) Service for Data Center Transformation. According to Jimmy Augustine, group product marketing manager for CMDB/APM at HP Software, interest in data center transformation is running high because IT organizations not only need to consolidate data centers to lower costs, they also want to completely re-engineer IT management as a first step toward building a private cloud. They know that the internal IT organization is competing against external cloud computing providers. If they expect to win that battle, they need to be able to discover all the applications and systems that any given IT service touches across multiple data centers. Augustine says the new HP service is designed to help automate the discovery of how specific services are delivered, which he rightly notes is the first critical step toward finding a more efficient way to deliver them.
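Conceptually, that kind of discovery boils down to building a dependency map and walking it. The sketch below is purely illustrative and is not HP's DDM service or API; it assumes a hypothetical inventory of scanned (service, dependency) pairs, with names such as "order-entry" and "billing-db" invented for the example, and simply shows how to find everything a given service touches, directly or transitively.

# Minimal sketch of service dependency discovery (illustrative only, not HP DDM).
# Assumes "scanned" pairs come from some hypothetical configuration scan.
from collections import defaultdict

def build_dependency_graph(relationships):
    """Turn (service, dependency) pairs into an adjacency map."""
    graph = defaultdict(set)
    for service, dependency in relationships:
        graph[service].add(dependency)
    return graph

def discover_footprint(graph, service):
    """Return every system a service touches, directly or transitively."""
    seen, stack = set(), [service]
    while stack:
        current = stack.pop()
        for dep in graph.get(current, ()):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

if __name__ == "__main__":
    scanned = [
        ("order-entry", "app-server-1"),
        ("app-server-1", "billing-db"),
        ("app-server-1", "auth-service"),
        ("auth-service", "ldap-dc2"),  # a dependency sitting in a second data center
    ]
    graph = build_dependency_graph(scanned)
    print(discover_footprint(graph, "order-entry"))

Real discovery tools populate a map like this automatically from network scans and configuration data; the point is simply that until such a map exists, there is no baseline from which to re-engineer service delivery.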


The sad truth is that many IT leaders are managing over-provisioned IT infrastructure not because they want to, but because the business lacks fundamental control over how the processes that infrastructure supports were first constructed and later managed. The good news is that the shift to cloud computing creates an opportunity to come clean about how chaotic both the business and the IT infrastructure that supports it have really become.


But as Augustine notes, before any of that can happen, somebody has to first discover what's actually happening today because it's hard to get somewhere when you don't really know where you are to begin with.



Comments
Aug 22, 2011 12:38 PM Margaret Dawson says:

Michael - sounds like an interesting report . . . BUT the cloud is not supposed to be a do-over. The whole point is to augment and extend existing infrastructure, not build from scratch, by leveraging cloud computing. No wonder companies are in pain if that's what they think. Also, this is why companies should not get so focused on private clouds but should leverage proven, secure public, hybrid or community clouds that allow them to start small, grow, and NOT rip and replace. Thanks for keeping the conversation alive.
