VMTurbo Addresses the ‘CloudOps’ Crisis with New Update

Michael Vizard

Just as IT operations teams complain about new applications being thrown onto production servers with little regard for the actual state of the IT environment, IT organizations are creating cloud computing instances that don't account for the reality of that environment. This happens not out of spite, but because no one has visibility into the existing state of the IT environment.

To address that specific issue, VMTurbo today released an update to its namesake virtualization management software that adds the ability to deploy new application workloads based on custom or pre-defined templates. According to Derek Slayton, vice president of marketing for VMTurbo, this capability makes use of advanced analytics that have been embedded in the company’s Economic Scheduling Engine to better control resource allocations.

Slayton says this capability is critical in the cloud era because, instead of creating a theoretically ideal cloud instance, IT organizations can tune that instance to the resources actually available. VMTurbo stops short of automating that process, however; it leaves the final decision to administrators, but provides a series of recommendations based on what it discovers about the IT environment.


Many IT organizations wind up having to roll back their first cloud computing deployment because they have no visibility into the actual environment. VMTurbo, says Slayton, provides that visibility in a way that supersedes a raft of existing IT management tools that generate reams of data but no actionable intelligence about how to optimize the IT environment.

The latest edition of VMTurbo also adds support for Java applications and Linux environments, along with hypervisor support for Red Hat Enterprise Virtualization (RHEV) 3.0. As IT environments grow more complex to manage, Slayton says, administrators must juggle multiple classes of workloads running on virtual machines that often share the same physical resources. Manually optimizing those environments is no longer feasible. Advanced analytics, however, give IT managers the insight they need to make the best decisions based on the business value of any given workload, he says.

When it comes to managing IT in the data center, there has never been a more challenging time. Virtualization brought much-needed flexibility, but at the cost of increased complexity. Rather than staring at a sea of conflicting data generated by multiple management tools, administrators need sound advice they can rely on to not only reduce that complexity but also make cloud computing within the enterprise a reality.
