Barry Phillips



Desktop Virtualization: Right Idea – Wrong Tool

Providing the centralized image management that IT needs as well as the real experience of a PC that users demand

Financial analysts, industry analysts, and CIO-focused publications all agree that Desktop Virtualization will be one of the most strategic business initiatives over the next few years. Many organizations have a VDI implementation or project in place solely because of the success they have had with server virtualization. However, Desktop Virtualization has a completely different value proposition than server virtualization. Server virtualization is all about CAPEX (capital expenditure) reduction, while Desktop Virtualization is all about OPEX (operating expenditure) reduction: roughly three dollars of OPEX are spent for every dollar spent acquiring a PC.

The whole value proposition of Desktop Virtualization is centralizing desktop images and management in the data center or network operations center. In other words, IT manages one copy of Windows and one copy of each application centrally instead of thousands of copies of Windows and applications on each PC.

Because it was the first Desktop Virtualization technology to market, VDI became synonymous with Desktop Virtualization. VDI was an evolution of server virtualization (in the case of VMware) or application virtualization (in the case of Citrix) to host and run Windows on a VM centrally instead of leveraging the local computational power of a laptop or desktop. VDI is great for a specific use case: users on a high-speed LAN who have relatively static images and are using thin client devices. For the 98% of users who are on laptops and desktops (410M PCs shipping in 2010 as compared to 6M thin clients), VDI is simply the wrong tool for the job.

Users want the same or better experience as the PCs they are used to having. They want to use multimedia apps and they want to install their own apps, all while operating over a WAN-type connection or disconnected from the network completely. This is especially the case as PCs become more and more powerful (Moore's Law) for essentially the same price.

IT wants to centralize the management of PCs without having to control the hardware. This enables IT to act as a service provider and offer Desktop-as-a-Service. Since server virtualization provided massive CAPEX savings, there is a natural assumption that desktop virtualization will provide the same CAPEX savings, while also providing the OPEX savings that occur when you only have to manage one copy of Windows and one copy of each application for everyone in an organization.

Here's where the wrong tool for the right job enters the story. While VDI is able to provide centralized management, it does not get down to single image management due to the lack of user personalization. Further, the infrastructure required for VDI has left most IT groups with sticker shock.

With the current density of approximately 50 users per server, a VDI implementation for 5,000 users requires roughly 100 servers. High-speed storage and all the networking gear to connect everything together also tend to add up quickly. As you can probably imagine, the ongoing power and cooling costs of that much infrastructure are not cheap either.
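The sizing arithmetic above is easy to reproduce. A minimal sketch, using only the figures quoted in this article (about 50 concurrent users per server, a 5,000-user organization); the density value is an approximation, not a vendor specification:

```python
# Back-of-the-envelope VDI server sizing using the density figure
# from the article (~50 concurrent users per server).

USERS_PER_SERVER = 50   # approximate density quoted above
TOTAL_USERS = 5000      # example organization size


def servers_needed(users: int, density: int = USERS_PER_SERVER) -> int:
    """Round up: a partially filled server is still a whole server."""
    return -(-users // density)  # ceiling division


print(f"{TOTAL_USERS} users at {USERS_PER_SERVER}/server "
      f"-> {servers_needed(TOTAL_USERS)} servers")
```

Note the rounding up: at 5,001 users the count jumps to 101 servers, which is why VDI capacity planning is always done in whole-server increments, before storage and networking are even considered.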

That covers the IT side of things, but the most important element here is user experience. Simply put, if the user experience is not better than or equal to the experience they have today, the solution will never succeed outside of pilot mode. More and more information is being delivered via video. Video chat and softphones are becoming more prevalent. Users have a plethora of personal and work applications that may include design and graphical applications that simply do not work well in a VDI environment. Working over a low speed network connection or disconnected from the network is becoming more common and introduces even more challenges in creating a working VDI environment.

Today, nearly all laptops shipping have at least a dual core processor, 4GB of RAM, and plenty of hard drive space. Soon, they will have quad core processors, more RAM, and enough hard disk space for all your photos, movies, and music combined - all for relatively the same price point.

What's needed is a solution that provides IT with the ability to realize OPEX gains by centralizing desktop management, but not have to empty their CAPEX budget in the process. Of course this solution also has to work for the users by enabling them to take advantage of the native performance of a PC and have the ability to work from any location - connected via a network or not. This is really what Desktop Virtualization is all about and why Desktop Virtualization is much bigger than just one of its use cases, VDI.

Assuming that eventually all forms of Desktop Virtualization use the same images, and that VDI solves the issues around persistent personalization without storing a separate copy of Windows and applications for each user, IT departments can centralize all forms of virtual desktops (VDI, laptops, and desktops) and manage one copy of Windows and one copy of each application for the entire organization.

At that point, Desktop Virtualization is really just a better way of doing desktop management. It's better not just because there is single image management instead of managing thousands of different laptops and desktops, but because of all the other benefits that occur when PC images are centrally stored and managed.

Disaster Recovery (DR) is one of those benefits as the primary copy of the PC image is always in the data center or network operations center. Additionally, DR with centralized images means an exact replica of the lost or damaged machine - not just the files. A PC image can even be moved temporarily to a virtual machine so that the user can continue to remain productive while his or her new machine is procured. An exact replica of the original PC image will be placed on the new PC along with any changes the user made while accessing via a virtual machine.

In addition, if the Desktop Virtualization solution integrates image layering into centralized image management, additional benefits or use cases are enabled including in-place Windows 7 migration and help-desk break fix operations.

With in-place Windows 7 migration, IT just assigns a new Windows 7 base image layer to a user and the new layer is sent down to the PC in the background. Once downloaded, the user will receive a message to reboot into Windows 7 and will have their Windows personalization and personal data in place. IT does not have to touch the PC and the user's downtime is only around 30 minutes.

Break-fix is accomplished on the centralized image by replacing the operating system or application layer while leaving all other layers on the PC intact. Since there is no troubleshooting involved in this procedure, a technician on a "level one" help desk team can normally follow a script instead of escalating to a Windows troubleshooting expert.
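The layering model behind both use cases can be pictured as a simple data structure: a desktop image is an ordered stack of named layers, and operations like break-fix or a Windows 7 migration swap exactly one layer while the user's layers are never touched. A toy sketch follows; all layer names and versions here are hypothetical, purely for illustration:

```python
# Toy model of a layered desktop image: an ordered list of
# (layer name, version) pairs, base layer first. Break-fix or an
# in-place OS migration replaces one layer centrally; the user's
# personalization and data layers are preserved as-is.
# All layer names/versions are hypothetical examples.

class DesktopImage:
    def __init__(self, layers):
        self.layers = list(layers)  # list of (name, version) tuples

    def replace_layer(self, name, new_version):
        """Swap one layer in place, preserving every other layer."""
        self.layers = [
            (n, new_version) if n == name else (n, v)
            for n, v in self.layers
        ]


image = DesktopImage([
    ("windows-base", "xp-sp3"),      # managed centrally by IT
    ("office-suite", "2010"),        # managed centrally by IT
    ("user-personalization", "-"),   # belongs to the user
    ("user-data", "-"),              # belongs to the user
])

# In-place migration: assign a new base OS layer; user layers survive.
image.replace_layer("windows-base", "win7-sp1")
```

Because only the named layer changes, the same `replace_layer` call models break-fix (re-sending a known-good application layer) and migration (sending a new OS layer) alike, which is why neither operation requires touching the user's files.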

In all of these cases, the image should be able to be run on the endpoint (not on a hypervisor), so users can work offline, use processor-intensive applications, and enjoy predictable, native PC performance regardless of network connectivity.

Desktop Virtualization is really a game-changing technology, but like other early technologies, it is going through some growing pains.

Centralizing image management via layering is definitely the next way of doing desktop management, and when done right it provides great use cases including patch and image management, complete PC back-up and recovery, in-place Windows 7 migration, and centralized break-fix capabilities.

For Desktop Virtualization to be truly successful, it will need to provide the centralized image management that IT needs as well as the real experience of a PC that users demand - all without breaking the data center budget.

More Stories By Barry Phillips

Barry Phillips is a seasoned Marketing executive with experience in both large and small companies. He joins Maxta after being the CMO of Panzura, Egnyte and Wanova (acquired by VMware), where he led Marketing, Sales, and Business Development. He came to Wanova from Citrix Systems, where he was the Group Vice President and General Manager of the Delivery Center Product Group. He joined Citrix through the acquisition of Net6. He began his career in United States Naval Aviation where he logged over 1,000 hours in a P-3C Orion.

Barry holds a Bachelor's degree in Computer Science from the United States Naval Academy and a Master's degree in Computer Science from UCLA.
