Protecting Business Applications in a Virtual World

Owen Cole
Some industry experts have claimed that there are still business applications that should not run on virtual servers. This is surprising, as Gartner comments in a research note that 'several interrelated trends are driving the movement toward decreased IT hardware assets using virtualization and cloud-enabled services.'

But virtualization, though widespread according to multiple surveys, still accounts for less than 40 per cent of all servers in the data center today, according to both CDW's Server Virtualization Life Cycle Report (January 2010) and F5 Networks' Trends in Enterprise Virtualization Technologies (2009).

The need for improved agility and the increasing cost and complexity of IT have driven many businesses to adopt virtualization technologies swiftly. While some virtualization experts claim that virtual computing environments are less secure than physical ones, others claim that virtualization can enable better security.

Both claims can be correct: when information security controls are improperly implemented or neglected in virtual environments, real security risks are exposed. These are the potential pitfalls of virtualization, but the good news is that they don't have to stop the technology from being implemented. Another growing trend alongside virtualization is delivering applications via the cloud, as this reduces cost and moves application headaches outside the business. However, this does not come without risk.

Essential protection

The benefits of virtualization are obvious. But every technology implementation needs to be weighed up in terms of the potential challenges and benefits, and virtualization is no different.

Security administrators and those who manage virtualization, predominantly server managers, need to understand phrases such as "hardened operating system," "walled garden," and "network segmentation" in the one-box-for-one-application world, as well as prepare for a new arena of distributed and targeted attacks. The need to understand these threats only increases as more elements of the network become virtualized and convergence blurs the boundaries between storage and server networks.

To completely protect a virtual environment, many questions need to be addressed, including:

  • How will current analysis, debugging, and forensics tools adapt to virtualization?
  • Which tools will security administrators need to master across the various virtualization platforms?
  • How will patch management impact the virtual infrastructure for guests, hosts, and management subsystems?
  • Will new security tools, such as hardware virtualization built into CPUs, help protect the hypervisor by moving it out of software?
  • How will security best practices, such as no-exec stacks, make a difference once environments are fully virtualized?
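The last two questions can at least be probed from inside a guest or host today. As a minimal, Linux-specific sketch (it assumes a `/proc/cpuinfo` in the usual format; the function and label names are illustrative), the following checks whether the CPU advertises the NX bit that no-exec stacks rely on and the hardware virtualization extensions a hypervisor can be moved onto:

```python
import os

def parse_cpu_flags(cpuinfo_text):
    """Extract the CPU feature flags from /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return set(line.split(":", 1)[1].split())
    return set()

def hardware_security_report(flags):
    """Map a few security-relevant CPU flags to readable labels."""
    checks = {
        "nx":  "no-exec stack support (NX bit)",
        "vmx": "Intel hardware virtualization (VT-x)",
        "svm": "AMD hardware virtualization (AMD-V)",
    }
    return {label: flag in flags for flag, label in checks.items()}

if __name__ == "__main__" and os.path.exists("/proc/cpuinfo"):
    with open("/proc/cpuinfo") as f:
        report = hardware_security_report(parse_cpu_flags(f.read()))
    for label, present in report.items():
        print(("present: " if present else "missing: ") + label)
```

A missing `vmx`/`svm` flag is one of the first things a server manager should check before planning a hypervisor deployment on existing hardware.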

These are all questions that need to be addressed before the enterprise world moves full-on into virtualization. More than anything, we should be thinking today about where virtualization security will take us tomorrow. We all agree that virtualization is here to stay, but those implementing it need to stay ahead of the threats and think about virtualized threat vectors before attackers have coded for them.

Optimizing Virtual Infrastructures

In addition to addressing security concerns, leveraging the optimization capabilities of modern load balancers (or application delivery controllers) is another way of making virtual infrastructure more efficient: it increases virtual machine density and offsets the impact of virtualization overhead on application capacity and performance. Given that almost every cloud provider uses some form of modern load balancing solution and can easily enable these optimization capabilities, it seems unlikely that the lack of virtualization awareness in applications will be detrimental to cloud computing adoption in the long run.
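The core mechanism an ADC applies here is a monitored server pool: requests are spread across virtual machine instances, and members that fail a health check are skipped, so density can rise without a single overloaded or dead VM dragging down the application. A toy sketch (member names like `vm-a` are hypothetical; real ADCs add weighting, persistence, and active health monitors):

```python
from itertools import cycle

class RoundRobinPool:
    """Minimal sketch of an ADC server pool: round-robin selection
    that skips members a health monitor has marked down."""

    def __init__(self, members):
        self.health = {m: True for m in members}
        self._ring = cycle(members)

    def mark(self, member, healthy):
        self.health[member] = healthy

    def pick(self):
        # Advance the ring, skipping unhealthy members; give up after
        # one full lap so a dead pool raises instead of looping forever.
        for _ in range(len(self.health)):
            member = next(self._ring)
            if self.health[member]:
                return member
        raise RuntimeError("no healthy pool members")
```

For example, a pool of three VMs serves them in turn, and marking one down with `pool.mark("vm-b", False)` simply removes it from the rotation until the monitor restores it.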

In fact, the ability to leverage both traditional and virtual resources, combining the local data center with cloud computing resources, is a bonus for businesses seeking to address capacity concerns without moving their entire infrastructure to an external entity. Optimizing through the solutions already required to implement a cloud computing infrastructure means that organizations moving to an internal cloud deployment are not forced to 'rip and replace' their entire infrastructure to support virtualization-aware applications; instead, they can leverage the virtual infrastructure and help server managers with the challenges of a virtual world.
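The hybrid pattern described above is often called cloud bursting: local capacity is preferred, and only the overflow is placed on external cloud resources. A deliberately simplified model of that placement decision (the class and the slot-counting scheme are illustrative, not any vendor's scheduler):

```python
class HybridScheduler:
    """Toy model of hybrid capacity: fill local data center slots
    first, bursting overflow workloads out to cloud instances."""

    def __init__(self, local_slots):
        self.local_slots = local_slots

    def place(self, workload):
        # Prefer local capacity; fall back to the cloud when it runs out.
        if self.local_slots > 0:
            self.local_slots -= 1
            return ("local", workload)
        return ("cloud", workload)
```

With two local slots, the first two workloads stay in the data center and the third is placed in the cloud, which is exactly the capacity relief valve the paragraph above describes.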

(F5 Networks is exhibiting at Infosecurity Europe 2010, the No. 1 industry event in Europe, held 27th-29th April in its new venue, Earl's Court, London. The event provides an unrivalled free education programme, with exhibitors showcasing new and emerging technologies and offering practical and professional expertise. For further information, please visit the event website.)

Apr 7, 2010 6:04 PM Lisa Dreher says:
Hi Owen, thanks for the great post and insightful points. I completely agree that "every technology implementation needs to be weighed up in terms of the potential challenges and benefits, and virtualization is no different." At Logicalis, we emphasize that the key to an optimal virtualized environment is component compatibility and the use of widely recognized standards. The biggest success comes when you standardize your hardware platform and your software environment as much as you can: the same hypervisors, the same underlying hardware, and all those pieces. Thanks again for the article.
