Virtualization Technology News and Information
Primary Data 2018 Predictions: 7 Trends will Impact Enterprise Priorities

VMblog Predictions 2018

Industry executives and experts share their predictions for 2018.  Read them in this 10th annual series exclusive.

Contributed by Lance Smith, CEO, Primary Data

7 Trends will Impact Enterprise Priorities

From software-defined storage and hyperconverged infrastructure management to applying AI and machine learning to storage services, the past 12 months have seen a number of tech trends gain traction and adoption. As these trends continue to gain momentum, they still require further innovation and strategic consideration to ensure successful implementation. Below are my predictions on how these seven trends will impact enterprise priorities in 2018:

Software solves storage complexity

Few IT professionals admit to a love of buzzwords, and one of the biggest offenders of the last few years is the term "software-defined storage." With marketers borrowing from the success of "software-defined networking," the label "SDS" gets attached to all kinds of claims, yet it does little to help most of us understand what a specific SDS product can do. Despite the well-earned dislike of the phrase, true software-defined storage solutions will continue to gain traction because they bridge the gap between legacy infrastructure and modern storage needs. In fact, even as hardware sales decline, IDC forecasts that the SDS market will grow at a rate of 13.5% from 2017 to 2021, reaching $16.2B by the end of the forecast period.

IT begins to choose custom management over hyperconverged

Hyperconverged infrastructure (HCI) aims to meet data's changing needs through automatic tiering and centralized management. HCI systems have plenty of appeal as a fast, pay-as-you-grow fix, but in the long run they represent just another, larger silo for enterprises to manage. In addition, since hyperconverged systems frequently require proprietary or dedicated hardware, customer choice is limited when more compute or storage is needed. Most environments don't need compute and storage in equal measure, so budget is wasted when applications really need only more CPU or more capacity. Most HCI architectures rely on layers of caches to ensure good storage performance; unfortunately, performance is not guaranteed when a set of applications running on a compute node overruns a cache's capacity. As IT begins to custom-tailor storage capabilities to real data needs with metadata management software, enterprises will move away from bulk deployments of hyperconverged infrastructure and instead embrace a more strategic data management role that leverages precise storage capabilities on premises and in the cloud.

Analytics adoption fuels storage and data intelligence

Storage has long been blind to actual demands on data beyond what was recently read or written. Applications have not been storage-aware, and vice versa, leaving IT in the dark about which data is hot and which is inactive, or cold. Metadata management solutions are bringing an end to these dark ages of educated guesses and costly over-provisioning of resources. With metadata intelligence, admins and software can finally see when files were last opened, how often, by whom, when they were modified, and more. IT can use this intelligence to manage storage resources more efficiently and align the right data to the resource that meets business needs for performance and price. Data that unexpectedly gets hot can move to a performance tier, like an all-flash array, and cold data can finally move off an expensive storage tier onto an archival cloud or object tier, in real time.
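As a rough illustration (not Primary Data's actual implementation), a metadata-driven tiering decision might look like the following sketch. The fields `last_access_days` and `access_count_7d`, and the thresholds, are hypothetical stand-ins for the kind of metadata such a system would collect:

```python
from dataclasses import dataclass

@dataclass
class FileMetadata:
    path: str
    last_access_days: float   # days since the file was last opened
    access_count_7d: int      # opens in the past week

def choose_tier(meta: FileMetadata,
                hot_accesses: int = 50,
                cold_age_days: float = 30.0) -> str:
    """Classify a file as hot, cold, or in between from its access metadata.

    Thresholds are illustrative; a real system would derive them from
    observed workload patterns rather than fixed constants.
    """
    if meta.access_count_7d >= hot_accesses:
        return "flash"        # performance tier, e.g. an all-flash array
    if meta.last_access_days >= cold_age_days:
        return "archive"      # archival cloud or object tier for cold data
    return "shared"           # general-purpose shared storage

# Example: a log file untouched for two months is routed to the archive tier.
stale = FileMetadata("/logs/2017-09.log", last_access_days=60, access_count_7d=0)
print(choose_tier(stale))  # archive
```

Run continuously against live metadata, a classifier like this is what lets placement react "in real time" instead of waiting for a periodic capacity review.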

Storage silos break down, agility builds up

Flash, shared, and cloud storage give IT teams more options than ever in addressing specific application or business needs. However, most storage systems have long been architected as islands unto themselves, and don't really connect well with others. This means that most IT teams at businesses operating for more than a few years are managing a complex collection of individual storage systems with different capabilities depending on their age. With data virtualization, IT teams can finally create a global namespace that makes these different resources simultaneously available to applications. IT can then set policies to automatically align the right data to the right resource from existing infrastructure and into the cloud, gaining the agility to meet evolving needs by managing data and storage much more effectively.
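The global-namespace idea can be sketched in a few lines: applications address one logical path tree while a mapping layer resolves each path to whichever backend currently holds the data. This is a hypothetical toy, not any vendor's API; the class and backend names are invented for illustration:

```python
class GlobalNamespace:
    """Toy mapping layer: one logical path tree over many storage backends."""

    def __init__(self):
        self._placement = {}  # logical path -> backend name

    def bind(self, logical_path, backend):
        # Called by the data management layer when data is placed or moved.
        self._placement[logical_path] = backend

    def resolve(self, logical_path):
        # Applications keep using the same path even after the data moves.
        return self._placement[logical_path]

ns = GlobalNamespace()
ns.bind("/projects/renders/frame_001.exr", "all-flash")
print(ns.resolve("/projects/renders/frame_001.exr"))  # all-flash

# Policy later demotes the file; the application-visible path is unchanged.
ns.bind("/projects/renders/frame_001.exr", "cloud-object")
print(ns.resolve("/projects/renders/frame_001.exr"))  # cloud-object
```

The point of the indirection is exactly the agility described above: placement policy can rebind data to a different resource without applications noticing.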

Storage services become bespoke with machine learning

With the ability to add software that collects metadata from application usage, analyzes metadata across the enterprise storage infrastructure, and aligns data requirements to storage system capabilities, admins gain the power of machine learning intelligence. Rather than applying a one-size-fits-all solution, IT can match business needs to existing or incrementally added storage resources, becoming a much more strategic operation. Over time, data management software with machine learning intelligence can help admins continue to refine their policies and fine-tune how they ensure both performance and savings from their unique infrastructure.
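The refinement loop can be pictured with a deliberately simple feedback rule. This sketch is an assumption about how such tuning might work, standing in for fuller machine learning models: if files promoted to flash are rarely re-read, the promotion threshold was too permissive and should rise; if they are almost always re-read, it can fall:

```python
def tune_hot_threshold(threshold, promoted_hit_rate,
                       target=0.9, step=5.0):
    """Nudge the access-count threshold for promoting data to flash.

    promoted_hit_rate: fraction of recently promoted files that were
    re-read after promotion. A low rate means promotions were wasted,
    so the threshold is raised; a high rate means we can promote more.
    """
    if promoted_hit_rate < target:
        return threshold + step   # promote less aggressively
    return threshold - step       # promote more aggressively

# Wasted promotions push the threshold up; good ones pull it down.
print(tune_hot_threshold(100, promoted_hit_rate=0.5))   # 105.0
print(tune_hot_threshold(100, promoted_hit_rate=0.95))  # 95.0
```

Even this crude loop captures the claim in the paragraph: policies stop being static settings and start adapting to how the infrastructure is actually used.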

Manual migrations fade to black

While manual migrations won't disappear in 2018, those on the front lines of innovation will begin to put them to rest. Migrations are one of the top frustrations for many IT professionals because they are complex, risky, costly and time-consuming. Yet with software that enables automated data mobility, data can overcome the gravity that makes migrations so complicated. Once different storage resources are connected in a global namespace, data can move freely to the right resource to meet evolving needs. If IT needs to decommission an old storage system, software can let admins set objectives, determine the best new home for the data on the old system, and put an automated migration in motion in just a few clicks. Software that enables two-way traffic - the ability to move cold data back to on-premises resources or higher-performance storage - gives admins peace of mind that they have complete control over data management, and open standards such as the Flex Files feature in NFS v4.2 enable even active data to be moved without risk to data integrity.
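An objective-driven decommissioning plan might be sketched as follows. The file objectives, resource figures, and matching rule are all invented for illustration; the idea is simply that each file's stated requirement, not an admin's manual copy job, picks its destination:

```python
def plan_migration(files, destinations):
    """Map each file on a retiring system to the cheapest destination
    that still satisfies the file's performance objective."""
    plan = {}
    for f in files:
        fits = [d for d in destinations if d["iops"] >= f["min_iops"]]
        if fits:
            plan[f["path"]] = min(fits, key=lambda d: d["cost_per_gb"])["name"]
    return plan  # files with no qualifying destination are left for review

old_system_files = [
    {"path": "/db/orders.ibd", "min_iops": 100_000},   # busy database file
    {"path": "/archive/2015.tar", "min_iops": 100},    # cold archive
]
destinations = [
    {"name": "all-flash", "iops": 500_000, "cost_per_gb": 0.80},
    {"name": "cloud-object", "iops": 1_000, "cost_per_gb": 0.02},
]
print(plan_migration(old_system_files, destinations))
# {'/db/orders.ibd': 'all-flash', '/archive/2015.tar': 'cloud-object'}
```

Once a plan like this is computed, the mobility layer can execute it automatically, which is the "few clicks" workflow the paragraph describes.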

Storage admins become data admins

No one says, "storage is the lifeblood of our economy" - it's all about the data. As such, with software that enables IT to finally manage data across storage, the role of the storage admin will evolve. With the ability to see data activity through metadata analytics, admins will shift from manually managing capacity and provisioning new storage to assessing strategies that meet application owner requirements. Machine learning software will assist in much of this work. Data admins will even be able to create service catalogs from which application admins can self-select the levels of performance and protection they need and budget accordingly.


About the Author

Lance Smith 

Primary Data CEO Lance Smith is a strategic industry visionary who has architected and executed growth strategies for disruptive technologies throughout his career. Prior to joining Primary Data, Lance served as Senior Vice President and General Manager of SanDisk Corporation's IO Memory Solutions group, following SanDisk's acquisition of Fusion-io in 2014. He served as Chief Operating Officer of Fusion-io from April 2010 and as Fusion-io President from August 2013, where he was responsible for productizing a number of Fusion-io solutions. Before joining Fusion-io, Lance held Vice President and Director positions at RMI Corporation, Raza Foundries, and the Computational Products group of Advanced Micro Devices. He has also held management roles at technology companies NexGen, Inc. and Chips & Technologies, Inc. Lance holds patents in microprocessor bus architectures, and received a Bachelor of Science degree in Electrical Engineering from Santa Clara University, where he focused on Digital Electronics and Semiconductor Physics.

Published Monday, November 06, 2017 7:38 AM by David Marshall