Why VDI?

Martin Ingram

With desktop virtualization technology fast maturing, the question "Why VDI?" ceases to be a technical one and becomes a business decision.

As with many new technologies, the early days of desktop virtualization were primarily concerned with future capabilities, and actual deployments were limited in both size and use case. The products have advanced to the point where many of those "futures" are now here, and we are in a position to deploy to broad groups of users. However, the newer techniques achieve business benefit in a completely different way from the early implementations, and it is critical to understand the differences if real benefit is to be achieved.

For most organizations, the objective of desktop virtualization is to create a more manageable client computing platform and so address the high support and management costs of traditional PCs.

The management problems of the PC stem from its "personal" nature. We may deploy a gold image to new machines during a hardware refresh and try to keep them up to date with software deployment systems, but the reality is that once users start working on a machine, they quickly make it unique. That uniqueness makes the machine difficult and expensive to support: each machine has to be supported individually, so it is impossible to achieve economies of scale across the user base. This fundamental problem has held back PC management from the beginning, and it is one we are only now on the verge of solving.

Early VDI deployments did not attempt to challenge this basic problem with PCs; they just centralized the software images. This was good for serving remote users, where the cost of local support was prohibitive, but it was not a general solution. It failed to solve the core problem of PC management: uniqueness.

If we could keep the PC image identical for all users and also keep it up to date, the problems of "uniqueness" would go away. There have been attempts in the past to do this through PC lockdown: users were prevented from changing anything on the machine and were allowed to run only a limited set of authorized applications, which kept the software components consistent. However, the loss of user flexibility and the cultural problems in taking away users' control over their work environment severely limited the success of lockdown.

What if we could keep the core PC components identical and up to date, but let users have the degree of flexibility they expect and need to be productive? This is the aim of today's generation of VDI. The key to this approach is to split the PC image into a number of separate software components that can then be standardized and managed independently. These components are dynamically assembled each time the user logs on, ensuring that users are always running the latest version of each component. Typically, the components fall into three categories: operating system, applications and user environment.
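
To make the idea concrete, here is a minimal illustrative sketch of how a broker might assemble a desktop from those three categories at logon. It is written in Python purely for illustration; the catalogue, the user-environment store and the compose_desktop function are hypothetical examples, not any vendor's API.

```python
# Illustrative sketch only: it models the idea that a desktop is assembled at
# logon from independently managed components, so every user receives the
# current version of each shared component while keeping a personal layer.

# Hypothetical catalogue of centrally managed, independently versioned components.
CATALOGUE = {
    "operating_system": {"name": "corporate-generic-os", "version": "2.3"},
    "applications": [
        {"name": "office-suite", "version": "12.1", "delivery": "app-virtualization"},
        {"name": "crm-client", "version": "4.0", "delivery": "browser"},
    ],
}

# Hypothetical per-user environment store (policy plus personalization).
USER_ENVIRONMENT = {
    "jsmith": {
        "policy": {"usb_storage": "blocked", "screen_lock_minutes": 10},
        "personalization": {"wallpaper": "blue", "default_printer": "floor2-laser"},
    },
}


def compose_desktop(user_id: str) -> dict:
    """Assemble a desktop description for one user at logon time.

    The shared components come from the central catalogue, so every user gets
    the same, current versions; only the user environment is unique.
    """
    return {
        "user": user_id,
        "operating_system": CATALOGUE["operating_system"],
        "applications": CATALOGUE["applications"],
        "user_environment": USER_ENVIRONMENT.get(user_id, {}),
    }


if __name__ == "__main__":
    desktop = compose_desktop("jsmith")
    print(desktop["operating_system"])   # identical for every user
    print(desktop["user_environment"])   # unique to this user
```

The point of the sketch is simply that only the last item varies from user to user; everything else is drawn from a single, centrally managed catalogue.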

The operating system component includes the operating system plus, perhaps, a small number of widely used applications. It is configured to a "corporate generic" level that will be adapted for each user as he or she logs on. Applications are kept separate to minimize conflicts and are typically delivered through application virtualization, although some are delivered from existing application servers, whether terminal services or browser-based.

The user environment comprises all the data on the PC that is associated with, and unique to, a particular user, and so it spans a wide range of data types. Examples include policy settings that configure the operating system according to the user and his or her role, and personalization settings that ensure the user gets a familiar experience. In this way, the PC image remains completely standardized and all the unique aspects are managed independently of the image. This gives the level of standardization we need to better manage the platform without compromising the user experience of the PC.
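
As a purely illustrative example, the short Python sketch below shows how those two kinds of user-environment data might be layered onto a corporate-generic image at logon. The setting names and the merge rule are assumptions made for the example, not taken from any product.

```python
# Illustrative sketch only. The generic image settings stay untouched; the
# user's own settings are merged in at logon, and role-based policy wins over
# personalization wherever the two overlap.

GENERIC_IMAGE_SETTINGS = {
    "screen_lock_minutes": 15,
    "wallpaper": "corporate-default",
    "usb_storage": "allowed",
}


def apply_user_environment(policy: dict, personalization: dict) -> dict:
    """Return the effective settings for one user's session.

    Start from the corporate-generic image, layer on the user's personalization,
    then apply role-based policy so that policy always takes precedence.
    """
    effective = dict(GENERIC_IMAGE_SETTINGS)   # copy; the image itself is never modified
    effective.update(personalization)          # familiar, user-chosen settings
    effective.update(policy)                   # policy wins on any conflict
    return effective


# Example: a user whose role blocks USB storage but who keeps personal preferences.
print(apply_user_environment(
    policy={"usb_storage": "blocked"},
    personalization={"wallpaper": "family-photo", "screen_lock_minutes": 5},
))
```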

This "componentized" model is the one that achieves the benefits we are looking for in that it reduces the costs to manage the platform because the greater consistency and control over the software components leads to fewer support calls and quicker fix times.

As you evaluate desktop virtualization models, the key capabilities to look for are standardized operating system and application components, so that you can achieve economies of scale across the user base for everything that is common, and user environment management to deliver the user-specific aspects of the desktop. This combination of techniques is the way to achieve cost savings for IT while still giving users a familiar and productive working environment.

Nov 16, 2009 6:11 PM Nanouk Miller says:
Whether it's VDI or PCoIP (PC-over-IP), the protocol will not matter as long as the environment does not become as fractured as VSI (virtualized server infrastructure) has, with the market split between VMware, Citrix, Sun, IBM, HP and Microsoft. As the server side sorts out the winners and losers over time, the desktop can hurry this process along by not becoming as fractured. Regardless of whether you like Microsoft, the new Windows 7 "Windows XP Mode" will be the first introduction that many will have to virtualization. How the players work with or against one another will determine the near-term chaos of the desktop.
