Virtualization, one of the fastest-growing trends in IT, lets an organization run multiple server functions on a single computer. It also makes it easier to spin up new server instances quickly, and in case of failure, organizations can move server functions to new computers without interrupting service. Beyond saving large sites a lot of money, virtualization is moving to PC networks and serves as a foundation for cloud computing. Many myths surround virtualization. Some date from limitations that no longer exist; others are started and spread by those with separate agendas or a limited understanding of the technology. Here are some of the most common myths, identified by Global Knowledge.
What follows are 12 debunked virtualization myths identified by Global Knowledge.
Application myth: The applications we use can’t be virtualized.
In the recent past, when computers and servers had a single CPU, making some applications (like Microsoft SQL Server or Exchange) work virtually was nearly impossible. Today, with CPUs built with two, four, or more cores, these and other applications run very successfully in virtual machines.
Virtualization platforms today offer larger hosts at the same cost as yesterday’s systems, which means far more available capacity for virtual workloads. The key to virtualizing more demanding workloads today is simply proper planning and a solid understanding of the technology. If your applications run today on a physical server, they’ll run on a virtual server. Red Hat is one company that guarantees successful physical to virtual migration for applications. Applications rarely connect directly to hardware resources in such a way that would make them unusable on a virtual machine.
The exception that gives this myth some reality is database-heavy applications. Those are best placed on a host alongside workloads with lighter input/output requirements.
Cost myth: Virtualization is too expensive for us.
Like most IT projects, virtualization may seem expensive when an organization begins implementing it, but examined over the long run, it pays for itself. The money saved by running fewer servers (less HVAC cooling, less electricity, fewer operating system licenses, reduced maintenance) gives most organizations plenty of reasons to want more virtualization. An ROI (return on investment) calculation at the beginning of the project will help clarify an organization's true virtualization cost.
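A back-of-the-envelope version of that ROI calculation might look like the following sketch. Every dollar figure here is an illustrative assumption, not vendor pricing:

```python
# Illustrative ROI sketch for a server-consolidation project.
# All dollar figures are hypothetical assumptions, not real quotes.

def annual_savings(servers_retired, cost_per_server_year):
    """Yearly savings from retiring physical servers
    (power, cooling, licenses, and maintenance combined)."""
    return servers_retired * cost_per_server_year

def payback_years(upfront_cost, yearly_savings):
    """Years until the project pays for itself."""
    return upfront_cost / yearly_savings

# Example: retire 9 underused servers; assume $2,500/year saved per
# server and a $30,000 upfront cost for hosts, licenses, and training.
savings = annual_savings(servers_retired=9, cost_per_server_year=2_500)
payback = payback_years(upfront_cost=30_000, yearly_savings=savings)
print(f"Annual savings: ${savings:,}")        # Annual savings: $22,500
print(f"Payback period: {payback:.1f} years")  # Payback period: 1.3 years
```

Under these assumed numbers the project breaks even in well under two years, which is the kind of result that makes the long-run case for virtualization.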
Like all major software implementations, some of the available virtualization solutions cost more than others. The good news is that some of those solutions (and some less costly packages), have fixed and quite predictable subscription pricing. That can make budgeting and cost projections much easier to manage.
Usually, the main reason for virtualization is saving money: it is a matter of spreading high-end hardware costs over multiple virtual systems. If implemented correctly, with practical planning and sensible technology selections for both hardware and software, virtualization will save money.
Learning curve myth: Virtualization is too difficult to learn and too complicated to support.
The reverse is much more accurate. Virtualization makes use of the existing skill sets of most organizations' support staff. Unix, Mac, Linux, and Windows services function much as they did before virtualization, and the skills needed to support an existing hardware environment carry over easily to a virtual one. Operating systems in a virtual solution perform much like their physical counterparts.
Licensing myth: Virtualization creates licensing problems.
It may seem that virtualization could let a careless administrator use too many or too few licenses for a particular product, putting the organization at legal risk. In fact, virtual solutions are subject to the same restrictions as their physical counterparts. For example, Windows systems perform license activation via the Internet, or the software expires after a trial period and ceases to operate without proper activation. Separately, license-management packages can limit an organization's liability even without virtualization.
Management complication myth: Virtualization makes system and network management control complicated.
Virtual solutions are easier to manage than physical implementations. A management interface typically comes standard with any of the virtual platforms. From that interface, a manager or administrator can view the virtual system consoles, create backups, shut down or reboot one or more of the virtual systems, change hardware, and fully manage each of the operating systems running in virtual mode.
Overhead myth: Virtualization is just another application function to add to my servers that will slow them down.
There is a kernel of truth to this myth, although it is rarely significant. Early solutions worked with single CPUs and so added some overhead and some slowing. Virtualization providers, like VMware, RightScale, Sun and Microsoft, offer multiple product variations. As hardware platforms improved, so did the virtualization software offerings. The latest virtualization solutions perform so close to the speed of the native hardware that the difference is negligible.
Performance myth: Virtual machines perform worse than advertised.
While it is possible that some overeager salespeople promised more than they could deliver in the past, this myth seems rooted in a lack of understanding. CPUs designed for virtualization, gigabit and 10-gigabit Ethernet, and ever higher-performance hard disks and controllers all make this claim clearly a myth. In addition, virtualization software has improved to the point that virtual machine performance now challenges physical machine performance.
Ratio myth: Virtualization only works well when we consolidate more than 10 physical machines into one virtual machine.
This common myth says a virtualization project can only succeed when the consolidation ratio is high, 10-to-1 or more, so organizations may exclude servers whose resource requirements would push the ratio below that mark. A lower, more realistic ratio of underused physical systems to suitably sized virtual systems is 3-to-1, and it can still provide a positive ROI. The benefits of virtualization, such as greatly simplified disaster recovery, provide cost advantages beyond consolidation ratios.
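Even at the modest 3-to-1 ratio described above, the hardware arithmetic can come out ahead. The sketch below uses hypothetical prices purely for illustration:

```python
# Rough consolidation arithmetic for a 3-to-1 ratio.
# All hardware prices are hypothetical assumptions.

physical_servers = 9   # underused machines to retire
ratio = 3              # virtual machines consolidated per host
hosts_needed = physical_servers // ratio   # 3 virtualization hosts

cost_per_host = 8_000            # one larger virtualization host (assumed)
cost_per_server_refresh = 4_000  # replacing each old box like-for-like (assumed)

virtual_cost = hosts_needed * cost_per_host                         # $24,000
physical_refresh_cost = physical_servers * cost_per_server_refresh  # $36,000

print(f"Hardware saved at 3-to-1: ${physical_refresh_cost - virtual_cost:,}")
```

Even before counting power, cooling, and disaster-recovery benefits, the assumed 3-to-1 consolidation comes out $12,000 ahead of a like-for-like hardware refresh in this example.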
Recovery myth: Virtual machines take longer to back up and recover.
Any good backup and recovery plan depends on the people applying it, whether the systems are virtual or physical. Recovery methods for virtual systems offer quick recovery times compared to even the fastest tape-based systems. Too often, physical systems need a complete operating system reinstallation followed by restoring the applications, procedures that can literally take hours. Virtual machine recovery, on the other hand, is simple: it only requires replacing any faulty virtual disk files with working copies, and restoring service can happen in as little as a few minutes.
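The "replace the faulty disk file" step can be sketched in a few lines. The file names and the idea of a pre-staged backup copy are assumptions for illustration; real virtualization platforms wrap this operation in their own tooling:

```python
# Minimal sketch of file-level VM recovery: swap a corrupt virtual
# disk file for a known-good backup copy. Paths are hypothetical.
import shutil
from pathlib import Path

def recover_vm_disk(faulty_disk: Path, backup_disk: Path) -> None:
    """Replace a faulty virtual disk file with a backup copy.
    The virtual machine should be powered off before calling this."""
    if not backup_disk.exists():
        raise FileNotFoundError(f"No backup found at {backup_disk}")
    shutil.copy2(backup_disk, faulty_disk)  # overwrite the bad disk file

# Usage (hypothetical paths):
# recover_vm_disk(Path("/vmfs/web01/web01.vmdk"),
#                 Path("/backups/web01/web01.vmdk"))
```

Compared with reinstalling an operating system and its applications on physical hardware, a file copy like this is why virtual machine recovery can finish in minutes.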
Security myth: Virtualization is not secure.
Some negative talk comes from virtualization's requirement for a host operating system. Those host operating systems are usually configured to the bare minimum; from there, the organization sets up Secure Shell (SSH) access manually. Since the host operating system runs few, if any, network services, it is very secure.
It is also an excellent idea for each organization, regardless of size, to develop its own security standards as well as setting policies and procedures for complying with the internal standards.
Any software can be deemed unsafe. Virtualization is no more secure or less secure than physical server-based solutions. By following industry-standard best practices for configuring operating systems, storage solutions, and networking, any organization can create a secure environment. The U.S. Defense Information Systems Agency provides multiple guides for information assurance.
Server myth: Virtualization is only for servers.
Many organizations and individuals can benefit from desktop virtualization, which offers centralized management, an improved desktop experience, and better disaster recovery options. By using a "thin client" (a limited-function system that relies on a separate server) or connection software, users can connect to their own desktop system from any Internet connection in the world. Virtualization features like single disk imaging reduce storage requirements by eliminating duplicate copies of the same standard desktop configuration.
Size myth: Virtualization is designed only for larger networks.
Virtualization is suited for any organization with two or more servers. Beyond workload consolidation, virtualization has many extra benefits, such as live migration, high availability, fault tolerance, and streamlined backups. These features, along with many others, help organizations of any size reduce the overall cost of their infrastructure and simplify maintenance and management.