What do virtualization executives think about 2009? A VMBlog.com Series Exclusive.
Contributed by Mike Neil, general manager of virtualization at Microsoft Corp.
Virtualization in 2009
The late American baseball player and manager Casey Stengel once said, “Never make predictions, especially about the future.” This is good advice, but you don’t have to be Malcolm Gladwell to be able to look at the virtualization market today and spot some clear trends, such as:
- the budget pressures that the ailing economy is putting on businesses in nearly all industries,
- the need to reduce energy consumption,
- recent advances in virtualization management that have removed most of the impediments to adoption by even small businesses, and
- the shift of some applications to the Internet cloud.
Together, these trends will make 2009 a tipping point in the transformation of virtualization from a tool for large enterprise server farms into a far more ubiquitous technology, deployed by small and medium-sized businesses as well. Today, only about 12 percent of servers are virtualized. That may sound like a small share, but only a year ago it was around 5 percent. The world is starting to see that virtualization is the way of the future: the way to stop expensive server sprawl and to prevent further damage to the environment.
Let’s examine these factors in a little more detail.
The global financial crisis and accompanying recession have placed businesses and their IT departments under unprecedented pressure to contain costs. Virtualization is well known as a quick way, perhaps the quickest, to reduce IT costs and become more efficient, both in the datacenter and on desktops. This bottom-line windfall comes from server consolidation, reduced power consumption, streamlined testing and deployment of new applications, and other economies.
Of course, some vendors’ wares carry a lower total cost of ownership (TCO) than others’. Customers tend to benefit most from virtualization software that runs on the Windows platform, can manage both physical and virtual servers from a single management console, and offers interoperability with software and hardware from many vendors.
Ease of Management
The virtualization market is becoming fiercely competitive, and management software is emerging as the key area in which vendors differentiate themselves. That’s because the industry has come to realize that a rich set of management capabilities is critical to unlocking the full value of virtualization.
Most enterprises today have heterogeneous IT environments, with software and hardware from dozens of vendors. That’s why holistic management is so important: being able to configure, provision, deploy and back up all IT assets from a single console, regardless of whether they are physical or virtual, and regardless of brand. This level of consolidated management will be key to the widespread adoption of virtualization technology in datacenters and on desktops.
WorleyParsons, a global energy services company, estimates that it has saved more than $1 million a year since it virtualized, solely from the time savings associated with improved server management.
Improvements in management tools are also broadening the customer base for virtualization. In its early years, the technology required significant time and specialized expertise to implement and manage, but it is now becoming practical even for small and medium-sized businesses that may have only a handful of servers and no on-site IT professionals.
Reducing dependence on imported oil is just one impetus for the green computing movement, and virtualization takes businesses a giant step in this direction. It’s incredibly wasteful to have each application or service within an organization assigned its own server in the datacenter. CPU utilization rates at many businesses are in the single digits. Consolidating applications through virtualization reduces not only the number of servers needed, but also the amount of power required to run and cool them. In addition, workload- and power-management tools are now becoming available that can move workloads to a smaller number of servers and power down the unused machines.
MLS Property Information Network, a multiple listing service for more than 30,000 real-estate professionals, reduced not only its hardware and software acquisition costs but also its power bill. By consolidating servers and powering down unneeded machines during off-peak times, the company cut its datacenter’s electrical consumption by 60 percent, a savings of $30,000 annually. Running six virtual machines on one physical host takes about half the power of running six physical machines.
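The arithmetic behind savings like these is simple to sketch. The following is a minimal, illustrative calculation; the server wattages, electricity rate and consolidation ratio below are hypothetical examples chosen to mirror the "six machines onto one host at about half the power" scenario, not MLS’s actual figures:

```python
# Illustrative consolidation math -- all numbers are hypothetical examples,
# not measured data from any customer deployment.

def consolidation_savings(n_physical, watts_per_server, host_watts,
                          hours_per_year=8760, dollars_per_kwh=0.10):
    """Estimate annual energy use before and after consolidating
    n_physical lightly loaded servers onto one virtualization host,
    and the resulting dollar savings at a flat electricity rate."""
    before_kwh = n_physical * watts_per_server * hours_per_year / 1000
    after_kwh = host_watts * hours_per_year / 1000
    saved_dollars = (before_kwh - after_kwh) * dollars_per_kwh
    return before_kwh, after_kwh, saved_dollars

# Six hypothetical 300 W servers consolidated onto one 900 W host:
before, after, saved = consolidation_savings(6, 300, 900)
print(f"before: {before:.0f} kWh/yr, after: {after:.0f} kWh/yr, "
      f"saved: ${saved:.2f}/yr")
# The 900 W host draws half the power of the six 300 W machines combined,
# consistent with the rough "half the power" rule of thumb above.
```

Note that this omits cooling: every watt removed from the server load also removes roughly a watt of cooling load, so real-world savings are typically larger than the raw server-power figure.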
Enabling ‘The Cloud’
Cloud-based computing is emerging as another choice for customers, and virtualization is playing a key role in its development. If you think about it, cloud computing is just the logical extension of the trend that began with virtualization’s decoupling of applications from the physical machines they were running on. Once you accept the concept of virtual machines, moving those machines off-premises isn’t a huge leap. What the cloud will deliver will essentially be virtualized services. For example, I’d expect backup and recovery to be a prominent service offering in 2009, enabled by more workloads being encapsulated within virtual machines.
We’ll continue to see IT departments in 2009 acting more and more like internal cloud providers. As an example, Lionbridge Technologies, a provider of translation and localization services, recently created a virtual cloud lab using Windows Server 2008 Hyper-V, RemoteApp, Terminal Services Gateway and System Center. The company uses this internal cloud computing infrastructure as a more flexible, cost-effective and secure way to build and manage translation testing projects. And in the process of providing more flexible computing to its employees, Lionbridge also reduced hardware costs by 80 percent.
When customers start creating either on-premises dynamic datacenters or dynamic services-based environments, it’s critical that they understand how applications are connected and what they have to do to expand or remove the resources that an application requires. Microsoft is looking at how developers can use modeling to describe an application through its entire life cycle, from the definition of the application and its requirements, through architecture development, deployment and ongoing operation.
Not Just for Servers
Of course, virtualization is no longer just about servers; virtualization at the desktop has become a hot topic, too. This category includes presentation virtualization, application virtualization and hosted virtualized desktops.
There is no ‘one-size-fits-all’ solution for the desktop, and in 2009 IT pros will be able to choose from among several approaches to optimizing desktop computing for their enterprises, including Folder Redirection and Offline Files and Folders, Microsoft Application Virtualization, Terminal Services, Virtual PC 2007, and Windows Server 2008 Hyper-V with System Center management tools. I think we’ll see a lot of organizations use virtualization next year to address application- and backward-compatibility issues on the desktop.
As you can see, 2009 is destined to be a watershed year for the IT industry and virtualization in particular. And here’s a prediction I make with confidence: by the end of the decade, the physical model will no longer dominate. Virtualization will be so ubiquitous in businesses of all sizes that IT managers will have to justify why they aren’t virtualizing, rather than why they are.
About Mike Neil
Mike Neil is general manager of virtualization strategy in the Windows Server® Division at Microsoft Corp. in Redmond, Wash. Neil is focused on the delivery of Windows® virtualization technology, including Windows Server virtualization as part of Windows Server 2008, and Microsoft® Virtual PC 2007. Neil also directs the technical enablement of Microsoft’s broader vision for virtualization, including virtualization management tools. He has been responsible for Microsoft’s server and PC virtualization efforts since 2003.
Neil joined Microsoft as part of the company’s February 2003 acquisition of Connectix Corp. While at Connectix, Neil was the vice president of engineering and worked on the original team that developed Virtual PC for Mac, allowing Apple Macintosh computers to run Windows. Neil also was one of the founders of Pixo Inc., a small company that developed operating systems and applications for handheld devices, the most notable of which is the Apple iPod.
After attending the University of Michigan, Neil joined Apple Inc., where he eventually served as integration lead and technical lead for Apple’s operating system project, code-named Copland.