Virtualization Technology News and Information
Server Virtualization Poised to Take Off

Quoting from Bio-IT World:

Increased demands to keep IT costs contained and the availability of new technology are leading many organizations to consider server consolidation and virtualization projects. 

In fact, IDC estimates that more than three-quarters of all companies with 500 or more employees are already deploying virtual servers where software is used to partition a single server (usually a 2- or 4-way x86-based server) to run multiple operating systems and applications on virtual machines.

The benefits are so significant that those who have such projects underway say that 45 percent of new servers purchased this year will be virtualized, according to IDC.

Virtualization has been around for years – particularly in mainframe and minicomputer environments. Until recently, however, it attracted little interest in the Windows and Linux server communities. So why the sudden interest now?

One major factor is server sprawl. In most companies, the common approach when deploying a new Windows or Linux application is to dedicate a new server to each new application. The reason is simple: there are no conflicts between applications.

Additionally, for years companies have deployed one application per server for reliability. If multiple applications run on a single server and one hangs, in the past the entire server often had to be rebooted, forcing the shutdown of the other well-behaved programs. (This isn’t as much of a problem with today’s server operating systems, but old habits die hard.)

The problem with the one-server-per-application approach is that servers are often underutilized. For example, an April 2006 article in Baseline magazine noted that before Welch Foods undertook a large server consolidation and virtualization project, its servers were running, on average, at 10 percent CPU utilization. A major New Jersey pharmaceutical company (which asked not to be named) reports similar utilization rates on hundreds of its servers. Comparably low rates – often in the teens – are also common for memory utilization.

Obviously, this means lots of expensive compute capacity is sitting idle. But even worse, the cost for IT staff labor, software license updates, warranties, and service contracts to maintain a server over its lifetime is often three to four times the initial acquisition cost. That means a $2,500 server might cost an additional $7,500 to maintain and update over its lifetime.

If a company physically consolidated 10 servers, each running at 10 percent or lower utilization, onto a single server, that $7,500 would have to be spent on only one server instead of 10 – a cost savings of $67,500. This example is extreme, since a company would not combine 10 such servers onto one; spare capacity would be needed to accommodate fluctuations in load. But even if, say, five such servers were consolidated onto one, the cost savings are still significant.
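The arithmetic behind these figures can be sketched in a few lines. The sketch below uses only the numbers cited in the article (a $2,500 server with lifetime maintenance of roughly three times its acquisition cost); the function name is illustrative, not from any vendor tool.

```python
# Figures cited in the article: a $2,500 server costs roughly 3x its
# acquisition price ($7,500) in staff labor, licenses, warranties,
# and service contracts over its lifetime.
ACQUISITION_COST = 2_500
MAINTENANCE_MULTIPLIER = 3  # article cites three to four times
MAINTENANCE_PER_SERVER = ACQUISITION_COST * MAINTENANCE_MULTIPLIER

def consolidation_savings(num_servers: int, consolidated_to: int = 1) -> int:
    """Lifetime maintenance dollars saved by consolidating
    num_servers physical servers down to consolidated_to servers."""
    return MAINTENANCE_PER_SERVER * (num_servers - consolidated_to)

print(consolidation_savings(10))  # 10 servers -> 1: $67,500
print(consolidation_savings(5))   # 5 servers -> 1: $30,000
```

The 10-to-1 case reproduces the article's $67,500 figure; the 5-to-1 case shows that even the more realistic consolidation ratio still eliminates $30,000 in lifetime maintenance.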

Virtualization provides additional cost savings. Virtualization software lets an IT manager create multiple virtual machines that run on a single physical server. Each virtual machine can run a different operating system and application, and the virtual machines are isolated from one another, so applications do not conflict over server resources. This allows for the quick setup, deployment, and teardown of applications, saving IT staff time. For example, several recent articles in IT trade publications noted that IT staff can bring new virtualized servers online in 15 to 20 minutes.

The cost and time savings from combining consolidation and virtualization are getting the attention of IT managers. Industry analysts cite the reduced management costs afforded by physical consolidation of discrete servers, and the ease with which virtualization allows new applications to be deployed and administered, as the impetus for companies to rapidly convert their server infrastructures.

Specifically, IDC notes: “The notion of reducing complexity is driving a more widespread adoption that will take place over the next one to two years, not the five to ten year gradual market shift as in other technology areas.”

Read the original article here.


Published Thursday, June 29, 2006 6:42 AM by David Marshall