Industry executives and experts share their predictions for 2020. Read them in this 12th annual VMblog.com series exclusive.
By Stan Zaffos, Sr. VP, Product Marketing, Infinidat
Data Storage in 2020 - The Move to Smarter Storage
Data storage has come a long way in the 2000s - evolving from storage arrays and
archived tape to the AI-powered, software-centric systems that are integral to
today's business transformations. What does the first year of the 2020s have in
store for storage? Expect more innovation, more "smarts," greater concerns
about data protection, and a more strategic role for your organization's
storage leader.
5G Will Accelerate the Volume and Velocity of Data Collected
2020 will be the year 5G enters the mainstream. The new wireless standard will start
generating real value for companies deploying IoT projects. It will also start
to cause real headaches for procurement leaders who get bills for all the new
data that needs to be analyzed and stored.
The projections are eye-opening. Gartner predicts 66% of organizations will deploy 5G, and 59% will include IoT communications in their 5G use cases. The number of IoT devices will nearly triple from 2017 to 2025, up to 73 billion, according to IHS Markit. A total of 90 zettabytes (a zettabyte is a billion terabytes) of data will be created on those IoT devices by 2025 - roughly half the 175 zettabytes projected to be created by all computing worldwide that year.
Companies deploying IoT projects will need to plan for the data deluge that is coming.
They'll need to set up infrastructure and processes to filter the data,
pre-analyze it, categorize it, store it and dispose of it.
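To make that lifecycle concrete, here is a minimal sketch of what such an ingest pipeline might look like. The function names, categories, value ranges, and the 90-day retention threshold are purely illustrative assumptions, not a reference to any specific product or API.

```python
# Illustrative sketch of an IoT ingest pipeline: filter, categorize, store, dispose.
# All names and thresholds are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Reading:
    device_id: str
    value: float
    timestamp: datetime

def filter_reading(r: Reading) -> bool:
    """Discard obviously bad or out-of-range samples before they hit storage."""
    return -50.0 <= r.value <= 150.0

def categorize(r: Reading) -> str:
    """Pre-analyze and tag each reading so it can be routed to the right tier."""
    return "hot" if r.value > 100.0 else "warm"

def should_dispose(r: Reading, retention_days: int = 90) -> bool:
    """Retention policy: drop raw readings older than the retention window."""
    return datetime.utcnow() - r.timestamp > timedelta(days=retention_days)

def ingest(readings: list[Reading], store: dict[str, list[Reading]]) -> None:
    for r in readings:
        if not filter_reading(r) or should_dispose(r):
            continue                                   # filtered out or past retention
        store.setdefault(categorize(r), []).append(r)  # store by category/tier
```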
Today, companies are operating at equilibrium, but 5G will saturate their pipelines. Companies that plan well and allow for some flexibility will execute successful projects at reasonable costs. Those that don't will spend only what they can and will be limited in the value they can unlock from the data they manage to capture and store.
AI Applications Will Usher in the Age of 'Smarter Storage'
Artificial intelligence (AI) applications have become so sophisticated, and so ubiquitous, that many of us take their complexity for granted. Better-known AI-driven applications include autonomous vehicles, online assistants, image recognition in photographs, and game-playing systems powered by deep learning (such as chess and Go engines).
AI workloads will continue to generate business value in 2020. But, for
organizations to increase their reliance on AI, storage vendors will need to make it easier for
AI applications to access more data faster, in turn helping the systems learn
faster and unlock the value of the data.
AI and deep learning rely on very large data sets and very fast input/output (I/O). For algorithms to become more accurate, the systems need to perform streams of calculations on large volumes of data. Eventually they spot patterns and get smarter.
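As a rough illustration of why storage throughput can become the bottleneck in training, the toy loop below overlaps reads with compute using a small prefetch queue. read_batch() and train_step() are hypothetical stand-ins, not any particular framework's API.

```python
# Toy sketch: overlap storage reads with compute so the training loop isn't
# starved by I/O. read_batch() and train_step() are placeholders.
import queue, threading

def read_batch(i):
    # stand-in for reading a large batch from shared storage
    return list(range(i * 1024, (i + 1) * 1024))

def train_step(batch):
    # stand-in for one round of model computation
    return sum(batch)

def prefetching_loader(num_batches, depth=4):
    q = queue.Queue(maxsize=depth)
    def producer():
        for i in range(num_batches):
            q.put(read_batch(i))          # I/O runs ahead of the consumer
        q.put(None)                       # sentinel: no more batches
    threading.Thread(target=producer, daemon=True).start()
    while (batch := q.get()) is not None:
        yield batch

for batch in prefetching_loader(8):
    train_step(batch)
```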
As we enter 2020, data sets are getting bigger and demands for instantaneous decision making are becoming more prevalent. This puts stress on the training systems. Expect more demand for smarter storage systems to match the escalating intelligence of the applications themselves. We'll see more investment in tools like software-defined switches that open up more pathways for heavy analytics; QoS functions that dole out I/O resources more strategically; scale-out system architectures; and the ability to deliver data with lower latency.
A National Data Breach Standard in the U.S. Will Put Pressure on Storage Systems
The day of reckoning may not arrive until after 2020, but it's on the way. If the U.S.
adopts a national data breach standard that requires companies to encrypt the
bulk of their information, many will suddenly find themselves scrambling for
more storage space.
The problem is with de-duplication technology. Many storage systems on the market use this technology as a way to reduce their storage needs. The technology looks for repeated patterns and stores a pointer to the first instance of the data rather than writing it again. Vendors using the technology can sell a small amount of physical capacity whose effective capacity expands many times over, because duplicate information doesn't get stored multiple times.
When you encrypt data, the de-duplication advantage goes away. Encrypted data looks completely random, so there are no patterns to point to. This means storage systems that rely on the technology suddenly have much less effective capacity on hand. That will force companies to buy more storage to cover the shortfall - or look for alternatives to de-duplication technology.
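A toy example makes the effect visible: hash fixed-size blocks to count unique chunks, then "encrypt" the same buffer (simulated here by XOR-ing each block with a fresh random key, standing in for a real cipher with per-block IVs) and watch the dedup opportunity disappear. The block size and the simulated cipher are illustrative only.

```python
# Toy illustration: block-level dedup hashes fixed-size chunks and stores each
# unique chunk once. Encrypting the same chunks leaves nothing to dedupe.
import hashlib, os

BLOCK = 4096
data = b"A" * BLOCK * 100          # 100 identical 4 KB blocks

def unique_blocks(buf: bytes) -> int:
    hashes = {hashlib.sha256(buf[i:i + BLOCK]).digest()
              for i in range(0, len(buf), BLOCK)}
    return len(hashes)

def fake_encrypt(buf: bytes) -> bytes:
    out = bytearray()
    for i in range(0, len(buf), BLOCK):
        key = os.urandom(BLOCK)    # unique key/IV per block, so ciphertexts differ
        out += bytes(a ^ b for a, b in zip(buf[i:i + BLOCK], key))
    return bytes(out)

print("plaintext unique blocks:", unique_blocks(data))                 # 1   -> ~100:1 dedup
print("encrypted unique blocks:", unique_blocks(fake_encrypt(data)))   # 100 -> no dedup
```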
GDPR in Europe has language recommending encryption as a best practice but no across-the-board rules requiring it. In the U.S., certain states are pursuing GDPR-like privacy rules. If the U.S. passes a national standard mandating encryption, that would change the storage landscape.
Containers Will Create a More Competitive Storage Environment
Containers and multi-cloud implementations have exploded in recent years. This trend will
accelerate in 2020. More enterprises will push to create flexible computing
environments where multiple clouds serve specific strategic purposes. They will
embrace the flexibility containers promise, creating set-ups where containers
can move freely between public cloud, private cloud and on-premises
environments.
Aside from making I/O profiles more storage-friendly, this flexibility will help
improve operational efficiency both on the server side and in storage by
simplifying deployment and enabling applications to move between on-premises
and various off-premises environments.
Increased use of containers and Kubernetes will help create a more competitive storage environment. Being able to port workloads seamlessly among diverse environments will diminish the strength of vendor lock-in and put pressure on incumbent storage vendors to innovate in areas that improve financial and operational efficiency: lower acquisition and ownership costs, better staff productivity through more autonomic operation and cloud integration, and new pricing and service offerings.
By 2023, Self-Renewing Storage Infrastructures Will Become an On-Premises Reality
Self-renewing storage enables us to provide the illusion of immortal storage by making data migrations and tuning things of the past: a prerequisite to storing massive amounts of immortal data. Self-renewing storage takes the ideas of fault tolerance, self-healing, and virtualization to the next level to create a storage solution that is elastic, intelligent, and always on. Fault tolerance enables the storage system to continue servicing I/Os in the presence of hardware failures. Self-healing enables storage arrays to non-disruptively restore full performance even before the failed hardware is repaired or replaced.
AI and policy-based orchestration (i.e., data placement), combined with tight integration with the cloud, will allow data to reside where it naturally belongs, based on the potentially conflicting capacity, performance, and cost demands that exist within any organization, without human intervention beyond accepting recommendations. Within the context of self-renewing storage, AI becomes an embedded component of the storage infrastructure rather than an application workload. Self-renewing storage shrinks ownership costs by reducing the time and resources needed to decommission older storage systems, which in turn lowers the strength of vendor lock-in and the cost of infrastructure refreshes: big deals when trying to satisfy insatiable data growth while extending the service lives of installed storage systems.
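For illustration, a minimal sketch of policy-based placement might weigh a dataset's capacity, latency, and cost constraints against each tier and pick the cheapest compliant option. The tier names, prices, and latencies below are invented for the example and do not describe any particular product.

```python
# Minimal sketch of policy-based data placement: choose the cheapest tier that
# still meets a dataset's latency requirement and has free capacity.
# Tier names, prices, and latencies are purely illustrative.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    latency_ms: float     # typical access latency
    cost_per_tb: float    # $/TB/month
    free_tb: float        # remaining capacity

def place(dataset_tb: float, max_latency_ms: float, tiers: list[Tier]) -> str:
    candidates = [t for t in tiers
                  if t.latency_ms <= max_latency_ms and t.free_tb >= dataset_tb]
    if not candidates:
        raise RuntimeError("no tier satisfies the policy; recommend expansion")
    best = min(candidates, key=lambda t: t.cost_per_tb)   # cheapest compliant tier
    best.free_tb -= dataset_tb
    return best.name

tiers = [Tier("nvme-flash", 0.5, 90.0, 50.0),
         Tier("hybrid",     5.0, 35.0, 400.0),
         Tier("cloud-cold", 50.0, 8.0, 10_000.0)]

print(place(dataset_tb=20, max_latency_ms=10, tiers=tiers))   # -> "hybrid"
```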
About the Author
Stanley Zaffos is the senior vice president of product marketing at Infinidat, a provider of enterprise storage solutions. As a former research vice president at Gartner, Stanley oversaw the firm's Magic Quadrants for General Purpose Storage Arrays and conducted research on hybrid and solid-state/flash storage arrays, software-defined storage, HCI, replication technologies, and acquisition and asset management strategies.