What do Virtualization and Cloud executives think about 2012? Find out in this VMblog.com series exclusive.
Quantum: 2012 Storage Predictions
Contributed article by David Chapa, chief technology evangelist, Quantum
1. The definition of "the cloud" will begin to stabilize. Often throughout 2011, when "the cloud" was discussed, the conversation about its make-up was, almost without question, compute, network, virtualization and lots of disk storage. But if "the cloud" is really going to store as much data as some of the research firms have estimated, then another chief ingredient must be included, at least in the near term: tape automation. As much as some may want to proclaim its death, the simple economics of tape storage quietly yet firmly state something quite different. In 2012, many will realize the importance of having tape as part of the big cloud story. Mark Twain and tape have something in common.
"The
report of my death was an exaggeration"
Mark
Twain said it and tape is undoubtedly storing it somewhere. Tape is not
dead; it is simply being positioned differently to take advantage of its
technology merits to meet customer requirements. So how will the
definition of the cloud begin to stabilize? First of all, there will be a
general acceptance of what "the cloud" consists of, simply what IT is already
managing today in its environment, which again includes compute, storage,
network, virtualization and yes, tape storage. Secondly, when IT recognizes
that it is okay to become the consumers of services versus the providers of
services is when "the cloud" will find much greater definition and segmentation
in this industry. This dynamic will be a big paradigm shift for many.
2. Big backup and recovery changes. We have heard it said that "virtualization changes everything," and that could not be more true for data protection. Rear Admiral Grace Murray Hopper, one of the inventors of COBOL, is quoted as saying, "The most damaging phrase in the language is, 'we've always done it this way.'" With virtualization, IT now has a chance to change without risk of attack by the naysayers who fear change. When IT takes the plunge into virtualization, it begins to look at the overall environment through new lenses, and those new lenses allow IT to divorce itself from "the way we've always done it" and look at better approaches, not just to "backup and recovery" but to a real data protection strategy. Backup applications have always been very "sticky" in customer environments, and virtualization offers customers the chance to change how they think about backup and recovery. New, thinner and more integrated approaches will characterize the revised look of data protection in a virtualized world. Deduplication will become even more critical to protecting these infrastructures efficiently and effectively without compromising performance or oversubscribing secondary storage to compensate for architectural deficiencies.
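For readers who want a feel for the mechanics, deduplication at its core splits incoming data into chunks, fingerprints each chunk with a cryptographic hash, and stores only the chunks it has never seen before, so repeated backups of largely unchanged data consume very little new capacity. The Python sketch below is a deliberately minimal, hypothetical illustration of that idea (fixed-size chunking, in-memory storage), not a description of any vendor's product.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for simplicity; real products typically use variable-size chunking


class DedupStore:
    """Minimal content-addressed store: keeps one copy of each unique chunk."""

    def __init__(self):
        self.chunks = {}  # SHA-256 digest -> chunk bytes

    def write(self, data):
        """Store data and return its 'recipe' (the ordered list of chunk fingerprints)."""
        recipe = []
        for offset in range(0, len(data), CHUNK_SIZE):
            chunk = data[offset:offset + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in self.chunks:  # only previously unseen chunks consume space
                self.chunks[digest] = chunk
            recipe.append(digest)
        return recipe

    def read(self, recipe):
        """Rebuild the original data from its recipe."""
        return b"".join(self.chunks[digest] for digest in recipe)


if __name__ == "__main__":
    store = DedupStore()
    backup_monday = b"A" * 8192 + b"B" * 4096
    backup_tuesday = b"A" * 8192 + b"C" * 4096  # mostly identical, as successive backups tend to be
    recipe_1 = store.write(backup_monday)
    recipe_2 = store.write(backup_tuesday)
    assert store.read(recipe_1) == backup_monday
    print(len(store.chunks), "unique chunks stored for 24 KB of backup data")  # prints 3
```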
3. Breaking down the barriers to public cloud adoption. One industry voice has said that "public clouds are dirty and unsanitary." Oddly enough, that was because the company this person represented had only a private cloud solution at the time. Now, of course, there is talk of private, public and this thing called the "hybrid cloud," a combination of both. I will say that to some degree I agree with that statement about public clouds; what creates the perception that public clouds are dirty and unsanitary are the three main concerns IT leaders have about using them: security, access and control. Security, access and control continue to be the barriers to entry keeping the public cloud from gaining enterprise acceptance. What IT wants is truly secure multi-tenant storage solutions. IT needs to know that its access to and control of its data will not be limited. Perhaps the problem with the public cloud is that so many providers rushed to offer something to take advantage of the market buzz and hype. Clearly, IT needs to interview public cloud providers as strictly as it interviews the staff who run its own datacenter; it is that important. Due diligence on the part of IT will make the difference, and hopefully cloud providers will "bake in" many of these learnings and requests from savvy IT organizations to "clean up" the public cloud's perception and image. I firmly believe that 2012 will need to focus on how these three barriers of security, access and control can be broken down.
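To make "truly secure multi-tenant storage" a little more concrete, the hypothetical Python sketch below shows the kind of tenant isolation IT should expect a provider to enforce: every object lives in a per-tenant namespace, and every request is authorized against the calling tenant, so one customer can never list or read another customer's data. It is a conceptual illustration only, not any provider's actual implementation.

```python
class MultiTenantStore:
    """Conceptual sketch of tenant isolation in a shared storage service."""

    def __init__(self):
        # tenant_id -> {object_name: data}; each tenant gets its own namespace
        self._namespaces = {}

    def register_tenant(self, tenant_id):
        self._namespaces.setdefault(tenant_id, {})

    def put(self, caller_tenant, name, data):
        self._authorize(caller_tenant)
        self._namespaces[caller_tenant][name] = data

    def get(self, caller_tenant, name):
        self._authorize(caller_tenant)
        try:
            return self._namespaces[caller_tenant][name]
        except KeyError:
            # Another tenant's object looks exactly like a missing object.
            raise KeyError(f"no object named {name!r} for tenant {caller_tenant!r}")

    def _authorize(self, caller_tenant):
        if caller_tenant not in self._namespaces:
            raise PermissionError(f"unknown tenant {caller_tenant!r}")
```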
###
About the Author
As the Chief
Technology Evangelist with Quantum Corporation, David Chapa is responsible for
representing Quantum's technology direction and solutions. David has invested
over 25 years in the storage industry, focusing specifically on data
protection, data disaster recovery, and business resumption practices. He has
held several senior-level technical positions with companies such as
OpenVision, ADIC, Quantum, NetApp and the Enterprise Strategy Group.