Virtualization Technology News and Information
3PAR's Executive Perspective on Storage Virtualization in 2009

What do virtualization executives think about 2009?  A VMBlog.com Series Exclusive.

Contributed by Craig Nunes, 3PAR Vice President of Marketing 

As we round the corner into 2009, many IT industry veterans are hanging their hats on the hopes of virtualization to get us through these troubled economic times. With mounting interest in cloud and self-service computing as delivery models for enterprise IT, it’s increasingly important for organizations to build cost-effective and sharable virtualized IT infrastructures based on utility computing architectures. Just as investments in alternative energy research and new green technologies are being touted as the panacea for our political, economic, and social woes, virtualization now finds itself—like the rest of us—under tremendous pressure to deliver the goods.

Untapped revenue streams are also an important theme for the coming year. There are countless opportunities for monetizing storage—as evident in the rise of Storage- and Software-as-a-Service and the adoption of cloud computing and cloud storage. 2009 will no doubt bring exciting innovation in this area, particularly as service providers evolve their business models and as more companies turn to storage and software service providers for help reducing costs.

But regardless of how you approach it, the singular fact remains that the dual pressures placed on service providers and enterprises alike to innovate while also reducing costs have produced a palpable need for smart, strategic storage. Sure, the buzzwords revolve in and out: cloud storage, cloud computing, utility computing, virtual computing. But in essence these are all variations on the same fundamental solution to the same age-old problem: the runaway train of data growth has left companies with an insatiable thirst for storage even as pressure mounts to reduce costs, to become more efficient, to do more with less. And virtualization promises to help them do just this.

Gone are the days of throwing more disks at your storage problems. 2008 showed us that the time has come to get smart. And in looking ahead to 2009, it’s clear that lean times call for thin storage. Not only have thin technologies such as thin provisioning been shown to reduce the total cost of data by 50%, but they also lower up-front storage costs, help organizations cut their energy and cooling expenses, and allow them to increase administrative efficiency by 10x. In Wave 11 of TheInfoPro’s Storage Study, released in October 2008, 87% of respondents acknowledged the importance of thin provisioning to their own datacenter plans.
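To make the idea concrete, here is a rough illustrative sketch of how thin provisioning works in principle (a hypothetical model, not 3PAR's actual implementation): a volume promises its full logical capacity up front, but physical capacity is consumed only when blocks are actually written.

```python
# Hypothetical sketch of thin provisioning: logical capacity is promised
# up front, but physical blocks are allocated only on first write.
class ThinVolume:
    def __init__(self, logical_blocks, block_size=4096):
        self.logical_blocks = logical_blocks
        self.block_size = block_size
        self.allocated = {}  # logical block number -> written data

    def write(self, block_no, data):
        if not 0 <= block_no < self.logical_blocks:
            raise IndexError("write beyond logical capacity")
        self.allocated[block_no] = data  # physical allocation happens here

    def read(self, block_no):
        # Unwritten blocks read back as zeros -- no physical space consumed.
        return self.allocated.get(block_no, b"\x00" * self.block_size)

    def physical_bytes(self):
        return len(self.allocated) * self.block_size

    def logical_bytes(self):
        return self.logical_blocks * self.block_size


vol = ThinVolume(logical_blocks=1_000_000)  # ~4 GB promised to the host
vol.write(0, b"A" * 4096)                   # only now is capacity consumed
print(vol.logical_bytes())    # 4096000000
print(vol.physical_bytes())   # 4096
```

The gap between the two numbers is the up-front capacity an organization no longer has to buy, power, and cool.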

For hosting and SaaS companies in particular, thin provisioning and utility storage have been a godsend because they enable new customers to be added to the storage environment in a matter of minutes instead of weeks with minimal pre-dedicated capacity and zero disruption. And as the trend towards IT outsourcing grows, this infrastructure agility will be even more strategic—and not just for service providers, but also for internal service bureaus such as IT organizations within enterprises and government agencies.

But perhaps one of the most interesting revelations of 2008 that will continue to play out over the coming year is that, if you are using server virtualization, you need storage virtualization. As organizations build out their virtualized infrastructures to support the delivery of enterprise IT as a utility service via cloud and self-service computing models, they are increasingly turning to server virtualization, blade servers, and utility storage technologies. The cost-savings beast is out of the box and needs to be fed, and its appetite is driving the push for efficiency down into the storage infrastructure. This means that in the coming year we will continue to see more and more organizations turning to thin provisioning and block-level virtualization to extend server virtualization benefits and enhance ROI. Even VMware has lauded this approach, going so far as to tout thin provisioning as the way to go.

This also means that in the world of storage virtualization, we’re finally going to see thin provisioning evolve to the next level. In 2008 we saw an explosion of storage vendors announcing support for thin provisioning—some “thinner” than others—but it was the introduction of the world’s first thin storage processor in September 2008 that will really set the tone for 2009. Simply put, the introduction of thin processing into storage arrays represents the most significant innovation in storage virtualization since the advent of thin provisioning. In essence, thin processing takes thin technology to the next level by supporting fat-to-thin conversion in silicon for greater written capacity utilization, faster performance, and increased system automation. “Thin Provisioning 2.0” is finally here and with it has come the possibility of performing ongoing, automated optimization of thin provisioned volumes so “thin” volumes can stay thin.
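The article describes fat-to-thin conversion as happening in silicon; a software analogue (purely illustrative, with hypothetical names) shows the underlying idea: scan a fully provisioned (fat) volume for zero-filled blocks and simply never allocate them on the thin target.

```python
# Illustrative software analogue of fat-to-thin conversion.
# (3PAR's thin processor performs this zero-detection in hardware.)
BLOCK_SIZE = 4096
ZERO_BLOCK = b"\x00" * BLOCK_SIZE

def fat_to_thin(fat_volume: bytes) -> dict:
    """Convert a fully provisioned image into a sparse block map.

    Zero-filled blocks -- the unused space in a fat volume -- are
    detected and dropped, so only real data consumes capacity.
    """
    thin = {}
    for block_no in range(len(fat_volume) // BLOCK_SIZE):
        chunk = fat_volume[block_no * BLOCK_SIZE:(block_no + 1) * BLOCK_SIZE]
        if chunk != ZERO_BLOCK:  # skip allocation for empty blocks
            thin[block_no] = chunk
    return thin

# A 10-block fat volume where only blocks 0 and 7 hold real data:
fat = bytearray(10 * BLOCK_SIZE)
fat[0:4] = b"DATA"
fat[7 * BLOCK_SIZE:7 * BLOCK_SIZE + 4] = b"MORE"
thin = fat_to_thin(bytes(fat))
print(sorted(thin))  # [0, 7] -- 8 of 10 blocks need no physical capacity
```

Doing this comparison in dedicated hardware, rather than in software as above, is what allows conversion without the performance penalty of scanning every block on the host.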

But thin processing is really only the tip of the iceberg. Perhaps the most exciting aspect of this new area of innovation is that it opens up a whole new realm of possibilities for the development of new thin storage applications—“Thin Provisioning 3.0”—which we’ll see develop as 2009 unfolds. One early example of this is the work going on between 3PAR and Symantec, who have teamed together to bring to market an industry first—the automated use of filesystem-level intelligence to continually optimize storage utilization and deliver infrastructure automation. This joint development work enables 3PAR arrays to use granular filesystem-level information to autonomically reclaim unused space within thinly provisioned virtual volumes, thus maintaining high capacity utilization without impacting host applications.
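The 3PAR/Symantec work is described only at a high level above; conceptually (a hypothetical sketch, not the actual interface between the two products), the filesystem reports which block ranges it has freed, and the array returns those blocks to the shared pool:

```python
# Hypothetical sketch of filesystem-hinted space reclamation: the
# filesystem passes down freed block ranges, and the array releases
# the corresponding physical blocks from a thin volume's block map.
def reclaim(block_map: dict, freed_ranges) -> int:
    """Release blocks the filesystem reports as no longer in use.

    `freed_ranges` is a list of (start, length) block ranges -- the
    kind of hint filesystem-level intelligence could provide.
    Returns the number of blocks returned to the shared pool.
    """
    reclaimed = 0
    for start, length in freed_ranges:
        for block_no in range(start, start + length):
            if block_map.pop(block_no, None) is not None:
                reclaimed += 1
    return reclaimed

# Thin volume with data in blocks 0, 7, and 9; the filesystem then
# deletes the files occupying blocks 5 through 9:
volume = {0: b"data", 7: b"more", 9: b"tmp"}
freed = reclaim(volume, [(5, 5)])
print(freed, sorted(volume))  # 2 [0]
```

Without this kind of hint, a thin volume only grows: the array cannot tell a deleted block from a live one, which is why "thin" volumes historically got fat over time.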

All of this only begins to hint at what 2009 has in store for thin technologies and the storage virtualization market. But regardless of what innovations lie in the not-so-distant future, rest assured that continued optimization of storage utilization coupled with advanced infrastructure automation will continue to be critical themes for storage virtualization in 2009.


Published Tuesday, December 09, 2008 5:48 AM by David Marshall