DataCore Software recently unveiled a series of new initiatives designed to simplify, automate and reward expansion of its software-defined storage (SDS) and hyperconverged Virtual SAN offerings throughout the globe. To find out more, VMblog spoke with DataCore's chief product officer, Rizwan Pirani.
VMblog: Software-defined storage has been a rapidly
growing market segment as users increasingly realize the benefits of a software
vs. hardware approach. What do you think are the factors behind this market
shift?
Rizwan Pirani: IT has been on a path to Software-Defined Infrastructure (SDI) for some time.
Virtualization, cloud, and DevOps are all trends that are driven by, or require, SDI to work.
Compute virtualization has been a driver for IT efficiency for the past 15 years or so.
Today, almost every workload is virtualized. Storage virtualization is lagging
server virtualization because storage is far more complex and because data is
the lifeblood of almost every company today. Compute can be ephemeral and can
scale horizontally, while storage needs to be persistent, performant, and
always available.
Software-defined storage (SDS) takes storage virtualization and adds additional services and
value around it. The main value is in flexibility (avoiding hardware and vendor
dependencies) and agility (simpler provisioning, management, and optimization).
Software-defined storage also offers many economic advantages such
as:
- The ability to integrate new technologies quickly
- Extending the life of current storage hardware
- Increasing the value of past and future investments
- Hardware and vendor independence
Hardware refresh cycles, especially for storage hardware, represent one of the most challenging aspects of IT. Typically, an IT department will go through a storage refresh cycle every three to five years, but in some cases the hardware can be used for a longer period of time. Software-defined storage is very flexible and lets enterprises add new storage and technologies, whether it's AFAs, NVMe, containers or cloud storage, non-disruptively, so they can be ready to integrate whatever comes next onto the existing hardware.
Software-defined storage allows an organization to build that kind of flexible architecture. Once the latest and greatest hardware comes out, it can easily be integrated into the environment, helping to modernize the data center and increase infrastructure agility. With software-defined storage, there's no need to rip and replace when new technologies arrive.
The same is true about architectures. With SDS it is easier to adopt different types of storage (direct-attached, arrays, JBODs, etc.) and move to new configurations like hyperconverged or hybrid-converged with minimal disruption. This reinforces the flexibility and freedom offered by SDS.
VMblog: How is the hyperconvergence market evolving over time? Many of the hardware vendors are shifting from a hardware to a software offering. Is this a sign of maturity, or is the market shifting?
Pirani: Not only are many of the traditional hardware vendors in the hyperconverged market shifting from a hardware to a software offering, the market as a whole has moved beyond the earlier vision of "hyperconverged" as the convergence of compute, storage and network in one single hardware unit, toward an encompassing technology that is essentially more of a "hybrid-converged" infrastructure. This is a sign of users' maturity in demanding hyperconverged systems that better meet their needs, and vendors are creating optimized solutions that effectively match those requirements.
Hybrid-converged is a superset of hyperconverged infrastructure that provides
the same capabilities of compute, storage and networking in one "box," which is
typically an x86 server loaded up with drives (either flash or HDD, or any
combination thereof), but with the additional ability to also connect to external storage. In other words, it provides all of the benefits of
hyperconverged infrastructure without sacrificing existing Storage Area
Networks (SANs).
With a hybrid-converged infrastructure, users no longer have to choose between a SAN or an HCI appliance; they can have both. This way, the capacity of existing hyperconverged nodes can be expanded by connecting to an external SAN without needing to shut down the HCI 1.0 nodes, add more storage, or purchase more nodes.
This is, again, an expected evolution towards the vision of software-defined infrastructure. The basic premise is that everything is defined, controlled, and managed via software, which means everything is flexible and there should be minimal hardware constraints. Hyperconverged, by contrast, is really a hardware-centric configuration model.
VMblog: Many
hardware storage vendors are adding more capable services to their arrays like
mirroring and encryption. What are the benefits of having this intelligence and
the services in a software-defined layer across all storage versus buying it as
part of a storage array?
Pirani: There are several advantages for IT organizations that decide to have an independent software-defined layer that offers services like mirroring or encryption. Here are a few:
- The services and their management are consistent across all storage systems, independent of vendors or types of storage.
- The services work across storage arrays. This means, for example, the system can dynamically auto-tier between an older array, cloud-based secondary storage, a newer flash array, and direct-attached NVMe storage, automatically (see the sketch after this list).
- A software-defined storage layer eliminates migrations. Decommissioning an old array or adding a new one is as simple as adding it to the pool and assigning a performance tier level.
- IT avoids being locked in by a storage vendor on multi-year contracts for their technology. Instead, they have the freedom to use any storage system they want, including low-cost commodity storage, which can perform very well given that the services are in a separate layer.
- SDS software is usually sold under a perpetual license, which can be very advantageous financially, and the software can be upgraded easily. In DataCore's case, we have moved away from charging for individual features, which means every customer (including those who purchased DataCore years ago) enjoys all of the features in their license tier, including auto-tiering, thin provisioning, mirroring, and continuous data protection.
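To illustrate what dynamic auto-tiering across dissimilar devices means in practice, here is a minimal, purely hypothetical sketch of the kind of policy such an engine applies. The device structures, tier numbers, and thresholds are invented for this illustration and are not DataCore's implementation; the point is simply that frequently accessed data is promoted to the fastest media with free space and cold data is demoted to cheaper tiers.

```python
# Hypothetical illustration of an auto-tiering pass over a pool of dissimilar devices.
# Structures, tier numbers, and thresholds are invented for this sketch; they are not
# DataCore's implementation. Hot data moves toward tier 0, cold data moves down.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Device:
    name: str
    tier: int            # 0 = fastest (e.g. direct-attached NVMe), higher = slower/cheaper
    capacity_gb: int
    used_gb: int = 0

@dataclass
class Extent:
    size_gb: int
    access_count: int    # accesses observed during the last sampling window
    device: Optional[Device] = None

def rebalance(extents, devices, hot=100, cold=5):
    """Move hot extents to the fastest tier with free space; push cold extents down."""
    by_speed = sorted(devices, key=lambda d: d.tier)          # fastest first
    for ext in sorted(extents, key=lambda e: e.access_count, reverse=True):
        if ext.access_count >= hot:
            targets = by_speed                                 # promote toward fast media
        elif ext.access_count <= cold:
            targets = list(reversed(by_speed))                 # demote toward cheap media
        else:
            continue                                           # leave warm data in place
        for dev in targets:
            if dev is ext.device:
                break                                          # already placed optimally
            if dev.capacity_gb - dev.used_gb >= ext.size_gb:
                if ext.device:
                    ext.device.used_gb -= ext.size_gb          # free space on old device
                dev.used_gb += ext.size_gb
                ext.device = dev                               # "migrate" the extent
                break
```

A real SDS engine works at the block or sub-LUN level, tracks access heat continuously, and migrates data in the background, but the promote/demote decision above captures the basic idea.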
VMblog: What should
users look for when choosing the optimal software-defined storage solutions?
Pirani: Buyers
should look at three main elements when evaluating software-defined storage:
- Flexibility.
This is one of the main reasons why IT departments should be building their
strategy on top of software-defined infrastructure. Therefore, it is important
that the platform allows full flexibility of hardware choices, network (iSCSI
or Fibre Channel), API accessibility, container-readiness, etc.
- Completeness.
An IT department requires a solution that provides all the services that make up a powerful SDS deployment: dynamic auto-tiering, performance acceleration,
mirroring and snapshots, continuous data protection, thin provisioning, and
ease of management.
- Maturity.
Today, data is the lifeblood of a company. Your data is your business. It is
important to be able to trust a software platform that is reliable, proven, and
tested.
VMblog: What other market
trends do you find interesting in the space and how will they impact DataCore's
growth?
Pirani: I find the market trends of data analytics and cloud maturity particularly interesting. In terms of data analytics, as more enterprises look toward big data for future opportunities, it can become the cornerstone of positive transformation when translated into actionable insights. We will increasingly employ algorithms, machine learning, and artificial intelligence (AI) to derive those insights. With the collection, synthesis, analytics and visualization models now available, big data is being transformed into data-driven intelligence, and this is a trend that will continue to impact the industry in the coming years.
The evolution, however, has only just begun. When data analytics meets predictability, the immense power of "prevention is better than cure" becomes possible. Leveraging predictability frees up invaluable human cycles for more abstract, sophisticated work while allowing compute, machine learning, and AI to forecast remedies pre-emptively. These logical deductions, performed pre-emptively, create a data-driven set of remedies. As businesses look toward big data for future opportunities and that data continues to get bigger and bigger, IT departments will need to embrace technologies such as software-defined storage to confront these challenges and achieve the promise that big data holds.
Regarding cloud maturity, I believe the IT industry has finally realized the place and
time for cloud. It is not the panacea everyone expected a few years ago. It is
also not a cost-cutting tool. The industry now understands the cloud is one
more tool at its disposal, and that datacenters are going to be around for a
while.
By the way, the line between 'traditional' datacenters and the cloud has been blurring, as the former are now mostly virtualized resources in a co-location environment. The evolution towards software-defined infrastructure, whether that looks like virtualized resources or public cloud, is what will give IT flexibility and freedom.
Most IT departments should know by now what they plan to host in the cloud and what they plan to host themselves. There are many reasons to do one or the other. The reality is that the future is hybrid: IT needs to build infrastructure that leverages the cloud as an extension of the on-premises infrastructure.
The first step is to become software-defined: to break the silos, achieve vendor independence, and break free from three-year vendor-imposed refresh cycles.
I'll say one last thing about trends: remember that they come and go. Every trend has a new set of cool vendors. There is a group of storage array vendors, and there are the AFA and HCI vendors. The risk is investing with one of these vendors for the long term, ignoring the fact that technology comes in cycles and that in a few years there is probably going to be a new set of technologies, with a new set of vendors you want to do business with. This is one more reason to go software-defined; you have the freedom to use whatever technology and whichever vendor your organization needs at any time.
VMblog: How does DataCore enable customers to shift storage spending from a
passive, recurring expense to a more valuable investment in software-defined
infrastructure?
Pirani: We mentioned the ability of SDS to provide advanced data services across old and new storage systems, but there are three important financial benefits that IT departments deploying DataCore enjoy:
First is Parallel I/O technology and advanced caching. I realize that most companies promise high performance, but there is only one company that smashed the previous record on the SPC benchmark. Most DataCore customers see a 5X increase in performance through a variety of technologies. This means an older storage array can produce flash-like performance, which could delay or avoid a storage refresh cycle.
Second, thin provisioning and intelligent pooling result in massively increased storage
utilization. Some IT departments see up to 50% more storage efficiency, which
again can translate into significant savings in storage hardware costs.
Third is efficiency. Every IT department is under-staffed, and storage teams spend far too much time managing storage, provisioning for users, and fork-lift migrating to new arrays. All this work is essentially eliminated with an intelligent SDS layer that moves data around as needed, provisions storage from the same VMware and Hyper-V consoles where virtual servers are created, and provides a set of services that automate storage management.
VMblog: How does DataCore help users to maximize IT
infrastructure performance, availability and utilization?
Pirani: To get better performance and availability, enterprises can't just "rip
and replace" hardware every few years because it's both expensive and
complicated. DataCore helps to optimize the underlying infrastructure,
increasing the performance, availability and utilization of all applications.
DataCore's software works with
existing infrastructure, extending the useful life of the equipment while
giving companies the flexibility to pick the hardware, technologies and
platforms to meet their needs and leverage existing IT investments when
possible.
Better utilization of current infrastructure and seamless integration of advanced technology reduce costs and give users more control over when to make hardware investments. DataCore's services are also
hardware-agnostic, providing a unified storage layer for applications and
management across different technologies and infrastructure vendors, easing
administrative burdens. New technologies that work with standard x86 hardware
can be easily integrated, from AFAs to NVMe, supplementing the capabilities of
the environment and simplifying the process of assigning the right storage tier
to each application.
To answer your question more specifically, DataCore has developed a series of unique technologies, including Parallel I/O and advanced caching, that eliminate bottlenecks and optimize all storage systems, old and new. The average DataCore customer sees a 5x performance increase, and we have held the SPC price-performance record for years. This means that old arrays may be able to run as fast as some of the newer AFAs. Further, dynamic auto-tiering across storage systems allows an IT department to add a little direct-attached NVMe and enjoy incredible performance gains.
DataCore also has been a leader in terms of
availability, supporting synchronous mirroring in local and metro clusters,
asynchronous replication for disaster recovery, and continuous data protection,
which is like a time machine to undo any damage from ransomware attacks. The most important aspect of ensuring availability is the recovery: recovery is instantaneous and automatic, with zero-touch failback and rebuild. Our
technology is mature, and it just works. Many of our customers have seen zero
downtime for many years.
In terms of utilization, we talked about the ability to get maximum value from existing investments. There is also thin provisioning, another technology we developed and patented, which simply eliminates the inefficiencies of storage allocations. Combined with auto-tiering across dissimilar storage systems, organizations can see a 50% to 70% increase in storage utilization.
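As a rough, hypothetical illustration of where those utilization gains come from (the volumes and numbers below are made up, not customer data), compare the physical capacity a thickly provisioned pool reserves up front with what a thin-provisioned pool actually consumes:

```python
# Hypothetical example: thick vs. thin provisioning for the same set of volumes.
volumes = [
    {"provisioned_gb": 2000, "written_gb": 600},
    {"provisioned_gb": 1000, "written_gb": 150},
    {"provisioned_gb": 4000, "written_gb": 900},
]

# Thick provisioning reserves the full provisioned size of every volume up front.
thick_gb = sum(v["provisioned_gb"] for v in volumes)    # 7000 GB reserved
# Thin provisioning only consumes physical capacity as data is actually written.
thin_gb = sum(v["written_gb"] for v in volumes)         # 1650 GB consumed

savings = thick_gb - thin_gb
print(f"Thick reserves {thick_gb} GB, thin consumes {thin_gb} GB")
print(f"Capacity returned to the pool: {savings} GB ({100 * savings / thick_gb:.0f}%)")
```

In practice the gains depend entirely on how over-allocated the existing volumes are, which is why reported improvements vary from environment to environment.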
VMblog: DataCore recently announced a new series of initiatives designed to simplify, automate and reward expansion of its software-defined storage (SDS) and hyperconverged Virtual SAN products (e.g. product enhancements, a simplified licensing/pricing model, a new customer loyalty program and a community platform). As one of the team members directly involved in developing and overseeing these programs, can you tell me a little bit more about each, and what this means to your customers? Your partners? The market as a whole?
Pirani: Partners will enjoy community forums where they (and their customers) can interact with other DataCore users to exchange best practices or discuss important topics. However, the main value for partners is the new simplified pricing. It is based on two factors: the deployment model and the amount of capacity (TBs). The new pricing model includes all of the features and has no limitation in terms of number of nodes, so an IT department has the flexibility to add nodes as needed.
The new initiatives ensure that customers derive higher value from existing hardware and software investments, driving accelerated adoption and expansion of software-defined storage technologies worldwide. Elements such as stepped-up discounts based on lifetime capacity consumption as well as no-charge feature upgrades will make it even more attractive for the market to scale up and out for added resiliency and throughput in pursuit of real-time, always-on data.
DataCore's new simplified licensing/pricing model also offers greater flexibility, allowing customers to transition at their own pace from conventional 3-tier SANs to 2-tier server SANs, or quickly collapse into hyperconverged systems, whether on-premises, at the edge or in the cloud.
As IT departments look to reap the benefits of the software-defined datacenter, customers will be able to quickly realize the performance, uptime and flexibility advantages of software-defined storage. This will help them spend less time on repetitive tasks and expand the technology to cover more of their IT footprint, including additional workloads or datacenters.
VMblog: Is DataCore doing anything in the space of predictive analytics
and machine learning?
Pirani: Predictive data analytics is a domain that DataCore is uniquely suited to
helping its customers capitalize on. In reality, DataCore has been using machine
learning for many years. For example, our patented caching algorithms use
machine learning to provide the most efficient caching possible.
Today, DataCore servers can transmit high-level telemetry data to a central server. This data (which is anonymous and opt-in) is available now for support purposes. We are currently working on innovative ways to leverage it, using a series of AI-assisted analyses to deliver predictive support, analyze patterns, and provide other intelligent services to customers.
VMblog: DataCore is expanding its support for critical new technologies
such as containers and NVMe. Can you tell me more about these areas?
Pirani: Container data services are expected to revolutionize the computing industry
with innovative solutions, and we are well placed to offer monumental value in
this space. It is very clear that containers are here to stay and will play a
big role in IT in the coming years, maybe even decades.
Containers bring unprecedented mobility, ease and efficiency for rapidly deploying and updating applications. At the same time, they introduce new storage challenges, creating an opportunity for innovation and differentiation.
As container deployments move from evaluation and testing phases to production, IT organizations require the ability to deliver the same data storage services that they currently provide to monolithic application architectures. More importantly, a solution has to be capable of providing shared storage to existing virtualized and bare-metal application infrastructures, as well as allowing DevOps engineers to consume storage on-demand, ensuring stateful application data is persisted, and providing the same level of availability and performance as currently delivered to traditional application infrastructures.
Software-defined storage can give administrators the ability to present persistent storage to container hosts deployed as VMs on virtual hosts, with the ability to provide persistent storage to container hosts deployed on bare metal as a next step. The presentation of the persistent storage should be done through the native controls of orchestration solutions like Kubernetes (as sketched below), and leverage advanced storage capabilities like CDP, auto-tiering and synchronous mirroring. As a result, users can manage the provisioning of storage to container deployments with the same platform as the rest of the application workloads, and provide the same level of enterprise storage services required for all critical production environments.
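As a concrete illustration of "native controls of orchestration solutions," here is a minimal sketch using the official Kubernetes Python client to request a persistent volume claim. The storage class name "sds-gold" is a hypothetical placeholder for whatever class an SDS layer's provisioner exposes; it is not a DataCore-specific name, and the actual integration would depend on the vendor's CSI driver or volume plugin.

```python
# Minimal sketch: a DevOps engineer requests persistent storage through
# Kubernetes-native controls. "sds-gold" is a hypothetical StorageClass assumed
# to be backed by an SDS provisioner; it is not an actual DataCore class name.
from kubernetes import client, config

config.load_kube_config()                    # or config.load_incluster_config()
core = client.CoreV1Api()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="app-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="sds-gold",       # hypothetical SDS-backed tier
        resources=client.V1ResourceRequirements(requests={"storage": "100Gi"}),
    ),
)

core.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)
```

The container then mounts the claim like any other volume, while the storage layer below decides which physical devices back it and applies services such as mirroring or CDP.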
VMblog: What is your
ultimate vision for DataCore's product lines/technology?
Pirani: DataCore continues to be a technological leader in the software-defined storage and hyperconverged infrastructure market. Its products are immensely applicable to enterprises today, as evidenced by the exceptional growth of the company and the diversity of the partner community. The deep emphasis on features such as high availability and disaster recovery, along with the plethora of data services at exceptional performance levels, has been cited by many customers, partners, and even competitors. But perhaps the biggest endorsement of value stems from the ROI achieved by customers through the elimination of incessant hardware refresh costs in their datacenters.
With this stable and enormous base of
exceptional technical value, we are now primed to transform datacenters with
focused investments in modern technologies. The applicability of prescriptive
services along with proactive support for an efficient datacenter will help make
DataCore the secret weapon for smarter IT departments that will use SDS as the
foundation for modernizing their datacenter and delivering business value.
##