Written by Goran Garevski, Vice President Engineering, Comtrade Software
It's fascinating how IT has evolved from its beginnings. Initially, computers started out as basic batch processors, albeit very large ones. Then, as if they weren't large enough to begin with, they were made bigger, and bigger, and bigger... Do mainframes sound familiar? That is, until the concept and cost of IT at the time exploded into micro components, resulting in a tectonic shift toward open systems IT.
It didn't take long before everything started to scale up again: first siloed servers with direct-attached storage (DAS), then siloed servers with virtualized storage, then virtual computing with virtualized storage. IT organizations ended up with a heterogeneous infrastructure plagued by questionable utilization, maintenance nightmares and scalability issues.
Enter the era of virtualization and
cloud computing and the rise of hyperconverged infrastructures. Hyperconverged
systems addressed the major challenges of open systems by:
- Simplifying usage and maintenance
- Enabling high resource utilization
- Providing scale-out capabilities
Evolution of Data Protection
Data protection also evolved throughout these shifts as new concepts emerged. In the mainframe world, one expected data protection to be integrated into the system. In open systems environments, data protection was also "open." Almost too open.
Anyone familiar with agents, media servers, backup servers? Let me help you. What about dedicated backup admins, backup implementation projects measured in months, professional services for upgrades, PhD prerequisites for finding your way through the backup console? Did I mention upgrades were also measured in months?
But again, it was not perceived as a problem, since the same approach applied to everything, not just data protection.
Virtualization as Disruption
Although not a new concept, server virtualization introduced major disruptions in IT: a new approach, new workflows, new I/O patterns. By focusing on specific problems like VM backup and recovery, and by empowering virtualization admins, some new VM data protection vendors turned the world upside down, bypassing the enterprise backup tools and conquering the virtual on-premises IT world. Flexibility, integration and concept alignment. Sound familiar?
The downside of VM-focused data protection was handicapped application data protection. Moving blocks left and right is not enough; one needs to make sure the business data is consistent when backed up and recoverable when needed. But it was fascinating how the industry closed its eyes and continued down a VM-centric path.
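To make the consistency point concrete, here is a minimal sketch, in Python, of the difference between a crash-consistent VM snapshot and an application-consistent one. Every name in it (Database, take_vm_snapshot, application_consistent_backup) is an illustrative stand-in, not any vendor's actual API; the point is simply that the application has to be flushed and frozen around the block-level snapshot.

```python
from typing import Protocol


class Database(Protocol):
    """Hypothetical handle to the application being protected."""
    def flush(self) -> None: ...   # persist in-memory writes to disk
    def freeze(self) -> None: ...  # pause new writes during the snapshot
    def thaw(self) -> None: ...    # resume writes


def take_vm_snapshot(vm_id: str) -> str:
    """Placeholder for a hypervisor snapshot call (crash-consistent on its own)."""
    return f"snap-{vm_id}"


def application_consistent_backup(vm_id: str, db: Database) -> str:
    """Freeze the application around the snapshot so the captured blocks
    represent a recoverable database state, not just a copy of disk blocks."""
    db.flush()
    db.freeze()
    try:
        return take_vm_snapshot(vm_id)
    finally:
        db.thaw()  # always resume the application, even if the snapshot fails
```

Skip the freeze/thaw steps and the snapshot still succeeds, but what you capture may be a database state the application can never recover from.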
Hyperconverged Data Protection
New IT architectures are attractive because, in many situations, they are used within products to address specific use cases. This most definitely holds true for hyperconverged architectures. New data protection vendors did emerge with internal hyperconverged architectures, but they focused on the legacy challenges of dispersed backup infrastructure rather than on being the ideal backup solution for the new on-premises (hyperconverged) clouds.
The truth is that hyperconverged data protection vendors have major technology, feature and cost overlaps with the hyperconverged infrastructure vendors. Which raises the question: would you buy a new house just because you need a garage?
Data Protection for Hyperconverged
Learning from data protection history, we see that the most successful data protection solutions in each era were those that were focused, conceptually aligned and INTEGRATED with the respective infrastructure. The key word here is "INTEGRATED."
Hyperconverged infrastructure is not just an infrastructure. It is a completely automated cloud that scales with business needs and is simple to use and provision. It comes with an advanced management stack, self-service portals and orchestration capabilities. In many ways, it is like the public cloud, just on premises. It's your LEGO IT - you can use it, play with it and hug it. But wait a minute! How do you protect it?
When you try to protect HCI with enterprise backup tools (the ones from the open systems world), it just doesn't feel right. It also costs even more. Imagine you build your HCI in a few hours. Great! Then you implement the backup in a week, you have zero integration with the HCI workflows, and it usually requires a separate IT group to sync with. And then you need to upgrade it... That means budgeting, planning, time, professional services. Repeat that for the next upgrade of your backup solution or of the HCI solution. The overall experience? Like buying a new car without a stereo and having a whole orchestra (read: professional services) following you, each member with their own instrument (management tool) and their own vehicle (devices).
What about the virtual backup tools? Can they do the job? Of course they can, but let's see how. People who have been around the software industry know that once you bake your concepts into the initial architecture of a product, it is almost impossible to change them as new concepts arrive. Virtual backup tools treat the hypervisor as a first-class citizen, providing slightly better usability than the enterprise (open systems) backup tools. That means they pretty much ignore the rest of the management and technology stack, so all of the disruptive and valuable concepts of HCI are ignored. Typical symptoms: VM stuns, multiple management consoles, limited automation. Like buying a brand-new car with a huge speaker strapped to the roof.
To really work, the HCI concept requires fully integrated data protection. Period. And it is not just the ability to integrate with the HCI's snapshotting API; any storage vendor provides freemium software for that. If the infrastructure is simple and invisible in the HCI world, the same should apply to data protection. It should have native integration into the overall HCI workflows, self-service portals and application life-cycle management portals. It should be a service. Ideally, Data Protection as a Service (DPaaS). Like having a brand-new car with a premium sound system that feels like part of the car. As it should be.
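As a rough illustration of what "native integration into the workflows" could look like, here is a hypothetical Python sketch in which the protection policy is declared in the same self-service call that provisions the VM, rather than configured afterwards in a separate backup console. Every name in it (HciClient, ProtectionPolicy, provision_vm and their parameters) is an assumption made up for illustration, not a real HCI or backup SDK.

```python
from dataclasses import dataclass


@dataclass
class ProtectionPolicy:
    """Hypothetical protection policy attached to a workload at creation time."""
    rpo_minutes: int       # how much data loss is acceptable
    retention_days: int    # how long restore points are kept
    app_consistent: bool   # quiesce the application before each snapshot


class HciClient:
    """Stand-in for an HCI management API exposed to a self-service portal."""

    def provision_vm(self, name: str, cpus: int, memory_gb: int,
                     protection: ProtectionPolicy) -> dict:
        # In an integrated stack, the same call that creates the VM also
        # registers its protection policy: no second tool, no second admin.
        return {"vm": name, "cpus": cpus, "memory_gb": memory_gb,
                "protection": protection}


if __name__ == "__main__":
    hci = HciClient()
    vm = hci.provision_vm(
        name="erp-db-01", cpus=8, memory_gb=64,
        protection=ProtectionPolicy(rpo_minutes=15, retention_days=30,
                                    app_consistent=True),
    )
    print(vm)
```

The design point is that protection becomes just another attribute of the workload, visible in the same portal and upgraded with the same stack, rather than a parallel system to budget, deploy and maintain.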
Selecting Data Protection for HCI
In the end, it is up to the everyday admin and IT manager to decide what should be used to protect modern HCI environments. Each IT group has its own budgetary and operational specifics. However, there are three things to watch out for:
- If your backup solution cannot keep up with your HCI infrastructure, something is wrong
- If your backup solution costs almost the same as your HCI infrastructure, something is very wrong
- If your backup solution is not manageable by your HCI staff, something is really, really wrong.
At the end of the day, what's the experience with your sound system?
About the Author
Goran Garevski, Vice President, Engineering, Comtrade Software
Goran Garevski joined Comtrade Group in 1993, reaching its highest technical level in 2001. From 2002 to 2004, he was CTO of StorScape, a storage resource management startup. Since then, he has held various business development and management roles within Comtrade, including his current role as VP of Engineering for Comtrade Software.
Mr. Garevski is a storage and data management industry
executive with a unique combination of deep market understanding and
comprehensive technology insight. Over the last 20 years, he has led and
contributed to numerous breakthrough projects that have influenced the storage
industry. Mr. Garevski has extensive experience in data protection,
virtualization and cloud storage. He holds a BSc in Computer Science from the
University of Ljubljana.