One of the biggest IT headaches in either public or private cloud computing is that end users come to believe a virtual machine can be spun up at will, regardless of the impact that might have on everything from network bandwidth to the total cost of IT.
To stay a step ahead of that behavior, IT organizations will need more sophisticated tools that let them easily discover what has suddenly been made part of their IT empire, whether internally or externally.
To address this issue, ScienceLogic this week at the Interop 2011 conference rolled out version 7.1 of its ScienceLogic EM7 cloud management platform, which adds a Dynamic Component Mapping Facility that automatically discovers not only changes to the internal environment, but also any changes made to virtual machines on the Amazon Web Services, GoGrid, Rackspace Hosting and Cloudkick cloud computing services.
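The core job of such a mapping facility, continuously reconciling what IT believes it has against what actually exists, can be sketched as a diff between inventory snapshots. The function name and data shapes below are illustrative assumptions for the sake of the example, not ScienceLogic's actual API:

```python
# Hypothetical sketch: detect added, removed, and changed virtual machines
# by comparing a previous inventory snapshot against the current one.
# The data shapes here are illustrative, not any vendor's real schema.

def diff_inventory(previous, current):
    """Return (added, removed, changed) VM IDs between two snapshots.

    Each snapshot maps a VM identifier to a dict of attributes,
    e.g. {"state": "running", "size": "m1.small"}.
    """
    added = sorted(set(current) - set(previous))
    removed = sorted(set(previous) - set(current))
    changed = sorted(
        vm for vm in set(previous) & set(current)
        if previous[vm] != current[vm]
    )
    return added, removed, changed

# Example: one VM spun up, one terminated, one resized.
before = {
    "vm-1": {"state": "running", "size": "m1.small"},
    "vm-2": {"state": "running", "size": "m1.large"},
}
after = {
    "vm-1": {"state": "running", "size": "m1.medium"},
    "vm-3": {"state": "running", "size": "m1.small"},
}
print(diff_inventory(before, after))
# → (['vm-3'], ['vm-2'], ['vm-1'])
```

In practice the "current" snapshot would be populated by polling each provider's inventory API, but the reconciliation step is the same regardless of where the snapshot comes from.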
To illustrate the point, ScienceLogic polled 150 Interop attendees and found that nearly 70 percent have deployed or plan to deploy cloud computing. Yet almost as many said they lack confidence in their strategy for managing the performance of those cloud computing resources.
ScienceLogic CEO David Link says it's apparent that the future of enterprise IT will revolve around both private and public cloud computing platforms. But to manage this new extended enterprise, IT organizations will need to be able to determine what is actually happening in the environment.
One day, that information may be automatically distributed to any number of IT automation tools that take appropriate action based on compliance policies defined by the IT department. In the meantime, IT organizations are understandably wary of trends such as cloud computing, which gives them less visibility into their resources than ever, and that is a recipe for disaster.