What do Virtualization and Cloud executives think about 2011? Find out in this VMblog.com series exclusive.
Contributed Article By George Teixeira, president and CEO, DataCore Software
2011 Perspectives on the Shifting Economies of Storage Virtualization Software, Private Clouds and Virtual Desktops
"The business return on investment of a pure 'software infrastructure,' where you pay once for intelligent software to manage, protect and get more from your storage assets - as they come and go from generation to generation and brand to brand - is a value proposition that is as compelling as it is inevitable."
- George Teixeira, President and CEO, DataCore Software
Prediction: Pent up demand, an ever-demanding competitive landscape, as well as lessons learned from the recession, will lead to a rapid standardization of dynamic, software-based virtual infrastructures, yielding new strategies for storage investments and open hardware choices.
As we saw this year, virtualization is not just about server virtualization anymore. It's about creating agile and enduring infrastructures with software solutions that evolve and adapt over time, allowing you to select whatever hardware you want to use and giving you optimal utilization and on-going ROI dividends over the lifetimes of your IT assets.
Pent up demand has storage hardware vendors hopeful. According to industry reports, the data storage market has roared back in 2010. The pent up demand from those who deferred upgrades or avoided a refresh cycle is real. However, it doesn't signal a trend back to hardware-dominated infrastructures. The more telling news is that there remains a large appetite for clouds, virtual desktops and storage virtualization, which highlights the increasing reluctance of buyers to rely strictly upon physical infrastructures.
Customers are justifiably more cautious than they were before the recession, and are, frankly, more savvy as to their options. They are eager for a cushion against rapidly obsolescing hardware devices. When times got tough, this "buy it again and again and again" model left them with few options to advance their infrastructures for more business continuity, mobility and performance. They couldn't repurpose existing hardware, or easily bring another vendor into the mix. So in effect, they have been asked to replace last year's gear well before its useful life has expired. I believe that we will look back on this recession as the catalyst for a more rapid decline of this inefficient hardware cycle - it's too expensive, too complex and doesn't bring customers much value.
Clearly, IT organizations are seeking an advantage over their physical infrastructures. This is why cloud, virtual desktops, server virtualization and storage virtualization are the talk of the industry. This is also why even traditional storage hardware vendors are trying to recast themselves as "software companies"-they've hit a wall and they can read the writing on it.
So at the same time as we are seeing a rise in storage hardware sales, we are simultaneously seeing a growth in software infrastructure solutions. They provide an advantage, almost insurance, in minimizing the current and future costs of the physical environments they manage; they open up buying choices and other options. As a DataCore customer recently stated, "No hardware vendor can own you - you get a perpetual 'Get Out of Jail Free' card that releases you from hardware dependence and storage vendor lock-in."
Prediction: In 2011, the threshold for viable Virtual Desktop deployments drops from many thousands of seats to a few hundred thanks to storage virtualization software.
The barrier slowing VDI adoption is well stated by my colleague, Ziya Aral, DataCore's Chairman:
"The problem for Virtual Desktops is that SANs are often implemented with large and costly storage controllers and complex external storage networks. While these have the advantage of achieving reasonable scalability, they introduce a very large threshold cost to virtual desktop implementations. To overcome this high capital cost burden, hardware vendors typically tout the economics of deploying several thousand virtual desktops.
However, Virtual Desktops are still at their introductory stage. Many companies, while understanding the potential benefits of the technology, are introducing pilot programs or attempting to fit initial Virtual Desktop implementations into existing structures. If the granularity of these implementations is to be in the thousands, then the user is forced to consume, not just the ‘whole loaf,' but an entire bakery truck full of loaves at one sitting...and this before even knowing whether the bread tastes good.
The alternative is equally unappetizing. The user 'bites the bullet' and accepts the high threshold cost and complexity of a full-blown SAN while running far less than the optimal number of Virtual Desktops. Now, the per-desktop cost of the implementation becomes much larger than it would have been if the 'old scheme' of discrete desktops had remained. This is the opposite of the effect desired from introducing a new, 'cost-saving' technology.
Put another way, the real problem is not in scaling up to ‘thousands' of Virtual Desktops but in scaling them down to practical configurations. It barely needs mentioning that this must occur without radically spiking costs at the low end and also without forgoing the SAN feature set which assures portability, availability, and data redundancy. Otherwise, the very benefits of Virtual Desktops are compromised."
DataCore has done extensive benchmarking to understand the economics of virtual desktops and has been able to configure high-availability storage configurations supporting around 200 virtual desktops. Initial findings have shown that, based on the configuration used, we can achieve a total hardware cost per desktop of under $35.00, including the storage infrastructure. This compares to previously published reports, which put the storage infrastructure costs alone typically at several hundred dollars per virtual desktop. We are continuing our research, but believe that based on these initial findings we will make a significant impact on removing storage costs as a primary barrier to virtual desktop deployments.
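The threshold-cost argument above comes down to simple amortization arithmetic: a fixed SAN entry cost divided over too few desktops dominates the per-seat price. The sketch below illustrates this; the $150,000 threshold cost and $50 per-seat figure are hypothetical numbers chosen for illustration, not figures from the article - only the ~200-desktop pilot size comes from the text.

```python
def per_desktop_cost(fixed_san_cost, per_seat_cost, desktops):
    """Amortize a fixed storage threshold cost across a deployment.

    fixed_san_cost: one-time cost of the SAN/controllers (hypothetical)
    per_seat_cost:  incremental hardware cost per desktop (hypothetical)
    desktops:       number of virtual desktops deployed
    """
    return fixed_san_cost / desktops + per_seat_cost

# Hypothetical traditional SAN: $150,000 threshold cost, $50 per seat.
pilot = per_desktop_cost(150_000, 50, 200)     # a 200-seat pilot
rollout = per_desktop_cost(150_000, 50, 5_000) # a full-scale rollout

print(f"200 desktops:  ${pilot:,.0f} per desktop")    # -> $800
print(f"5000 desktops: ${rollout:,.0f} per desktop")  # -> $80
```

The pilot pays ten times more per seat than the full rollout for the same infrastructure, which is exactly the "scaling down" problem described above: the fixed threshold cost, not the per-seat cost, is what prices small deployments out.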
DataCore will be posting additional updates on our research in early 2011; please check for updates and the full report at: DataCore addresses the virtual desktop cost challenge.
Prediction: The mid-market jumps on board the virtualization wave. Microsoft Hyper-V and its pricing spur a new wave of virtualization users, driving automation and management simplicity to become the new norm for storage.
DataCore customers have been realizing the benefits of software virtualization in the storage context for some time. See http://www.datacore.com/Software/Benefits.aspx. Why will 2011 be different?
The recession, for one. It changed what customers require by changing their threshold for what they can't afford to ignore. No longer can they afford to overlook the savings and efficiencies of virtualization. Also, a jobless recovery means businesses are choosing to grow with what they've got, and virtualization gives them the flexibility to repurpose what's already in place, helping to gain the most from prior IT investments.
But there is much more going on that is leading virtualization to the mainstream. We think two very important influences on adoption will result from the combination of what DataCore and Microsoft are doing. Microsoft Hyper-V has spurred another wave of virtualization users. With Windows Server 2008 R2 and Hyper-V, the larger Microsoft world began to embrace virtualization, and this trend will continue to accelerate. With the first wave of virtualization projects complete, more organizations, especially in the mid-market space, are moving to core production applications such as Exchange and SQL. These will require investments beyond servers, including networking, desktops and, of course, storage. These customers are looking for easy-to-use storage solutions that support the high data demands of virtual environments but are intuitive to the Windows community.
This is where DataCore will have a major impact on the mainstream use of virtualization in 2011. Time and the experience gained from thousands of deployments have taught us that we need to take the user's and operator's view. We understand that in this broadening virtualization market the operator isn't always going to be a highly trained storage administrator. What's different in 2011 is that virtualization software solutions must be simple to use. This is critical to broad acceptance. Certainly, what we have coming in terms of next releases will make the change come faster, not just for DataCore but for others who follow in our footsteps.
Our next generation of software will be a testament to the "software infrastructure advantage" - the ability of software to remove complexity and adapt flexibly. It is that basic, but that important.
The "software advantage" also becomes obvious when you consider the many shapes it takes. Because our software is portable, it can run on a VM or on a multitude of physical servers. It not only virtualizes and manages storage, but can coexist along with the server hypervisor in the virtualization layer, providing many possibilities to solve real world and real budget challenges. Hardware simply can't do that.
Prediction: The storage required to support the Clouds will remain too expensive until Clouds move to a hardware independent software-based storage virtualization model.
As with virtual desktops, potential private cloud customers are experiencing "sticker shock" due to demanding storage requirements. This has slowed commercial interest. Clouds by definition are software constructs that provide services as needed when needed. To do so, the combination of virtual servers, virtual storage and virtual networks working together as a combined virtualization layer is the obvious choice.
From our perspective, a cloud computing platform needs to, at its very core, be based on portable software. Many clouds currently are being built from a hardware vendor-specific mindset. So, what is wrong with this picture? Well, the whole point of cloud computing is delivering cost-effective services to users - and that requires the highest degree of flexibility and openness, versus being boxed in to specific hardware that cannot adapt to change over time. Aren't clouds, after all, supposed to be soft and agile?
Listen to what some of our cloud provider customers have said publicly:
Host.net: "We chose VMware for server virtualization, DataCore for storage virtualization and Cisco for network virtualization in the core design of our vPDC platform because each vendor delivers the very best virtualization component in their respective areas of competence," said Jeffrey Slapp, VP of Virtualization Services for Host.net. "Interestingly, we are finding that the demand for storage far outpaces the demand for individual virtual systems. With DataCore's software-based technology, we can deliver a highly flexible storage solution both to the customers of our cloud who are using virtual servers, and to the clients who are still utilizing physical servers in our data centers."
IOmart: "What we have achieved with DataCore SANsymphony is total flexibility; we can now move data 'at will' between vendors and seamlessly achieve virtualization across RAID arrays. No other solution offers us this," states Richard McMahon, IOmart Infrastructure Manager.
Prediction: Data growth will continue to be the biggest infrastructure cost challenge. However, the most important strategic driver in 2011 will be building business continuity and availability into the infrastructure as a whole instead of device by device. Without the umbrella of a software virtualization infrastructure for storage, continued data growth inevitably cannot be managed safely and affordably.
Storage has become one of the largest cost factors in IT, typically representing one-third to two-thirds of the total cost of hardware. And it continues to grow - with disk capacity needs doubling on average every year.
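Annual doubling compounds quickly, which is why this growth rate dominates infrastructure budgets. A minimal sketch, assuming a hypothetical 10 TB starting point (the baseline is invented for illustration; only the doubling rate comes from the text):

```python
def projected_capacity(start_tb, years, annual_growth=2.0):
    """Capacity after compounding annual growth (2.0 = doubling yearly)."""
    return start_tb * annual_growth ** years

# Starting from a hypothetical 10 TB and doubling each year:
for year in range(6):
    print(f"year {year}: {projected_capacity(10, year):,.0f} TB")
# Doubling yearly means ~32x the capacity (and the associated
# hardware, maintenance and administration cost pressure) in 5 years.
```

Even at a slower growth rate the compounding effect persists, which is the argument for managing capacity through a software layer rather than buying device by device.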
In a recent large-scale survey of over 1,000 datacenter and virtualization users, data growth was found to be the biggest data center hardware infrastructure challenge. The findings reported that all of the top data center hardware infrastructure challenges impact cost to some degree, but data growth is particularly associated with increased costs for hardware, software, associated maintenance, administration and services.
The same survey also concluded that many data center managers were forced to defer infrastructure upgrades and extend technology refresh cycles and, as a result, are now dealing with an aging infrastructure or, in some cases, product obsolescence.
Interestingly, when asked about the most important drivers of strategic change in their organizations' data centers through the end of 2011, business continuity and availability came in as the top requirement, cited by 50 percent of the more than 1,000 respondents.
Today, the real focus of concern is not so much about virtualizing servers but more about, "How good is my business continuity and high-availability, and is my shared storage in place to support it?"
The leading role of storage virtualization is finally beginning to be understood, owing to its critical part in protecting data and VMs for business continuity - and this is expected to become an even greater focus for IT solution providers, as well as IT directors, well into 2011.
Final Thoughts and Conclusion
The recession, like all major catalysts, quickly changed what customers expect from their future IT purchases. They were forced to react quickly to a volatile economy and a threatening competitive landscape, and came to understand the power of software to help them do this. They were able to get the most from existing hardware assets, which let them defer and sometimes eliminate large ticket storage purchases. The success of initiatives around clouds and virtual desktops will be judged on these same economic principles. By maximizing the value from the resources at hand, DataCore brings a fresh new "software advantage" to the tired game of storage. We look forward to an amazing 2011.
"We finally jettison the large storage tanks that have slowed the virtualization rocket. The heat of their expansion dissipates quietly below us, enabling IT to climb unaffected by gravity or glitches. And we didn't empty our coffers to pull it off." - An enlightened user.
A Simple Test to Make Sure Your Storage Virtualization Infrastructure Is Software-Based
Unlike most storage virtualization vendors, who sell "box" solutions but claim they are really software, DataCore lets you download our infrastructure software to virtualize your storage. The ability to download is a simple test of whether the solution is really software.
Please feel free to download and try DataCore software at:
http://www.datacore.com/Software/Closer-Look/Demos.aspx
About the Author
Mr. Teixeira creates and executes the overall strategic direction and vision for DataCore Software.
Mr. Teixeira co-founded the company and has served as CEO and President of DataCore Software since 1998. Prior to that time, Mr. Teixeira served in a number of executive management positions including Worldwide VP of Marketing and GM of the Product Business Group at Encore Computer Corporation, where he also played a major role as the team leader of OEM marketing and sales to Amdahl, IBM, and DEC. His work culminated in the $185 million sale of Encore's storage control business to Sun Microsystems in 1997. He also held a number of senior management positions at the Computer Systems Division of Gould Electronics.