DataCore Software 2021 Predictions: Storage will be Smarter, and Enable IT to be Future-Ready


Industry executives and experts share their predictions for 2021.  Read them in this 13th annual VMblog.com series exclusive.

Storage will be Smarter, and Enable IT to be Future-Ready

By Gerardo A. Dada, CMO, DataCore Software

Software-Defined Storage Will Increasingly Become Essential for IT Infrastructure to Achieve Flexibility and Cost Efficiencies

The storage industry has been relatively slow to adopt virtualization compared to networking, compute, and even security. A large portion of the industry still operates with a hardware-centric mindset that locks IT departments into specific vendors, technologies, and architectures.

Software-defined storage (SDS) will increasingly help make storage smarter, more effective, and easier to manage, while enabling IT to be ready for the future. Because virtualization abstracts the hardware completely, SDS has the power to manage, optimize, and simplify all kinds of storage: primary, secondary, and backup/archive, all under one unified platform that provides consistent services across different classes of storage and is managed through one predictive analytics dashboard.

This is the result of increasing demand by IT managers for the modernization of IT infrastructure through a uniform control plane capable of tapping the full potential of diverse equipment. Software-defined storage will increasingly be recognized as the solution to accomplish this, helping to break down silos and hardware dependencies and enabling smarter, more effective, and more economical use of storage. With this approach, all storage technologies are supported consistently, regardless of vendor or architecture, and the software can be deployed on new or existing servers, on turnkey appliances, or in the cloud.
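
To make the idea of a uniform control plane more concrete, here is a minimal sketch in Python of a vendor-neutral abstraction layer. It is purely illustrative: the class and method names are hypothetical and do not correspond to any vendor's actual API.

    # Illustrative sketch of a vendor-neutral storage abstraction layer.
    # Not a real SDS product API; backends and methods are hypothetical.
    from abc import ABC, abstractmethod

    class StorageBackend(ABC):
        """Anything that can hold data: SAN, NAS, cloud bucket, local disk."""
        @abstractmethod
        def free_capacity_gb(self) -> int: ...

    class VirtualPool:
        """Aggregates dissimilar backends into one logical pool of capacity."""
        def __init__(self, backends: list[StorageBackend]):
            self.backends = backends

        def total_free_gb(self) -> int:
            return sum(b.free_capacity_gb() for b in self.backends)

        def place_volume(self, size_gb: int) -> StorageBackend:
            # Deliberately simple placement: pick the backend with the most
            # free space. A real SDS layer would also weigh cost, performance
            # tier, and availability requirements.
            candidates = [b for b in self.backends if b.free_capacity_gb() >= size_gb]
            if not candidates:
                raise RuntimeError("pool exhausted")
            return max(candidates, key=lambda b: b.free_capacity_gb())

The point of the sketch is the shape of the design: applications see one pool and one set of data services, while the backends underneath can come from any vendor and be swapped out without touching the applications.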

Furthermore, in the current business environment created by the COVID-19 crisis, SDS has proven to be a powerful tool in helping IT departments achieve cost savings and get maximum value from their existing investments. As IT spending begins to ramp back up in 2021, SDS will continue to function as a major cost saver for IT teams, helping them get more predictable performance as they scale.

Enterprises should therefore modernize and automate legacy storage systems by proactively using advanced storage technologies to drive infrastructure-led innovation. As such, SDS will become necessary for implementing a modern software-defined infrastructure that makes the best use of existing resources while future-proofing the IT environment for years to come. In fact, Gartner's 2020 Strategic Roadmap for Storage states that 50% of global storage capacity will be deployed as software-defined storage on-premises or in the public cloud by 2024, up from less than 15% in 2020.

Intelligent Backup and Recovery Needs Grow

2021 will see growing usage of intelligent backup and recovery technologies as data protection increasingly becomes top of mind. Backup has been an afterthought for decades, but as data volumes continue to grow and become more distributed, recovery objectives and operational complexities increase dramatically alongside them. Backup and recovery is now a key workload, and legacy data protection approaches must adapt to modern infrastructures or be left behind. The question that remains is how to do so efficiently.

Nearly every data center management team aims to institute a robust backup and disaster recovery plan to handle the contingencies that cause downtime and data loss and disrupt business continuity. The objective of a data backup, replication, and disaster recovery solution should be to restore business services with a near-zero recovery time objective (RTO). This is only possible when the data protection and management solution is highly reliable, efficient, secure, and flexible enough to support the business requirements and evolving infrastructure of the organization.

IT departments are realizing the implication of frequent backup copies in terms of storage capacity and performance impact, which is driving efforts to find optimized ways to take snapshots, move them to more efficient storage, and lower total costs. At the same time, there is increased recognition that the value of backups, or any technology that helps with availability, does not come from the backup process itself, but from the ability to restore back to a good state in a quick and efficient manner.

In environments where a mix of storage hardware powers virtualized application workloads, ensuring fast and seamless backup is particularly challenging. As a result, IT organizations will continue to move away from legacy backup and replication systems toward more modern and sophisticated solutions, benefiting from intelligent data management features that give IT teams greater flexibility and increased ROI.

Classification of Data for Compliance with GDPR and Other Privacy Laws Will Increase the Value of Metadata

Companies typically have data scattered all over the place. There are many compelling reasons to aggregate and consolidate that data to create a unified view across all systems, but the need to classify data to comply with regulations such as GDPR (General Data Protection Regulation) and CCPA (California Consumer Privacy Act) is becoming one of the most critical.

These trends are pushing the value of metadata increasingly higher. Metadata can be very powerful, especially when it is tightly integrated with the data from a logical perspective yet decoupled from a location perspective, synchronized globally, and available for all types of data, whether stored using an object or a file protocol.

Metadata-driven management provides the foundational tools required to build rich attributes around the data store. For GDPR compliance, keywords and tags associated with data can be used to define what counts as privileged or personal information, whether information must remain within a country or can be moved more broadly, and whether the data needs to be encrypted. Tags also simplify finding data that must be deleted to comply with privacy laws, all without actually having to look at the data itself.
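
As a simple illustration of the principle (a hypothetical sketch in Python, not any particular product's metadata schema), compliance decisions can be driven entirely by tags attached to an object, without ever opening the underlying file:

    # Hypothetical sketch: compliance decisions driven purely by metadata tags,
    # without ever reading the underlying file or object.
    from dataclasses import dataclass, field

    @dataclass
    class ObjectMetadata:
        object_id: str
        tags: dict[str, str] = field(default_factory=dict)  # e.g. {"pii": "true", "residency": "EU"}

    def requires_encryption(meta: ObjectMetadata) -> bool:
        return meta.tags.get("pii") == "true"

    def allowed_region(meta: ObjectMetadata) -> str:
        # GDPR-style residency: personal data tagged for the EU stays in the EU.
        return meta.tags.get("residency", "any")

    def deletion_candidates(catalog: list[ObjectMetadata], subject_id: str) -> list[str]:
        # "Right to be forgotten": find every object tagged with this data subject.
        return [m.object_id for m in catalog if m.tags.get("data_subject") == subject_id]

    catalog = [
        ObjectMetadata("invoice-001", {"pii": "true", "residency": "EU", "data_subject": "cust-42"}),
        ObjectMetadata("logo.png", {"pii": "false"}),
    ]
    print(deletion_candidates(catalog, "cust-42"))  # ['invoice-001']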

Coupled with artificial intelligence (AI) and machine learning (ML) technologies, telemetry, and content-scanning software, metadata can be enriched with searchable tags and other valuable attributes based on location, time, utilization, data characteristics, and more. As a result, companies can better enforce data governance policies using metadata-driven data management.

Metadata-driven data management of distributed files and objects will become an important element of modern storage technology, in part because it helps drive automated decisions about the value of data and where it should be placed, while adding several dimensions to auto-tiering intelligence. IT should look to take advantage of the next generation of tools that enable this, not as a plug-in or a separate piece of software, but as a core part of the storage architecture of modern file and object systems.

The Cloud is not a Panacea; Primary Use Cases for On-Premises Applications Will Exist for a Long Time

The cloud was originally touted as a less expensive storage option. However, that concept was quickly dispelled when IT started using more cloud resources and the associated costs showed quite clearly that the cloud was not always the most cost-effective infrastructure option.

Today, enterprises have moved a portion of their data back on-premises, where they have more control and better economics. Yet the cloud still offers formidable simplicity, agility, and, yes, cost efficiency in many cases, one of which is long-term secure data storage.

The IT industry has become increasingly smart about what belongs in the cloud and what does not. However, the decision often comes down to either moving to the cloud or keeping data on-premises, because it has been very difficult to build truly hybrid systems, especially where data is concerned. Imagine a file storage system that has been in use for years and holds millions of files, some of which require immediate accessibility and some of which should be archived, but it is almost impossible to determine which is which.

Modern data management tools and software-defined storage that spans multiple public clouds, private clouds, multi-cloud deployments, and on-premises infrastructure will help the industry reach a level of maturity. This will be made possible by smart software that understands the profile of data, has the ability to access data anywhere, and therefore can move it in an automated fashion based on business rules.

Hybrid Cloud Requires More Mature Data Tiering

The hybrid cloud model mentioned above will also require more mature data tiering tools. This need will continue to grow: Gartner's 2020 Strategic Roadmap for Storage also predicts that by 2024, 40% of organizations will implement at least one hybrid cloud storage architecture, up from 10% in 2020.

However, there are currently very few tools that make movement to the cloud transparent and automatic, so the cloud often ends up as just another data silo. An intelligent hybrid system should move data to or from the cloud as needed to optimize cost and performance based on business needs, automatically and effectively. Ideally, data that has been moved to the cloud remains always available, so it is not 'shipped off' to a cloud but tiered intelligently while still being part of the same logical system.
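
As a sketch of what such a policy engine might look like (the tier names and thresholds below are invented purely for illustration), the decision to tier data to or from the cloud can be reduced to a rule over its access profile:

    # Illustrative tiering rule for a hybrid system. Thresholds and tier names
    # are invented; a real product would also weigh egress cost, compliance, and SLAs.
    from datetime import datetime, timedelta
    from typing import Optional

    def choose_tier(last_access: datetime, size_gb: float, now: Optional[datetime] = None) -> str:
        now = now or datetime.utcnow()
        idle = now - last_access
        if idle > timedelta(days=180):
            return "cloud-archive"    # cold: cheapest capacity, still addressable
        if idle > timedelta(days=30) and size_gb > 1:
            return "cloud-standard"   # warm: tiered out, but in the same namespace
        return "on-prem-flash"        # hot: latency-sensitive, keep local

    # Data tiered to the cloud stays part of the same logical system, so moving
    # it back is a policy change rather than a migration project.
    print(choose_tier(datetime.utcnow() - timedelta(days=400), 0.5))  # cloud-archive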

A software-defined system becomes the unifier across storage systems and the intelligent layer that controls data placement in an optimal way. Vendors who can span the public/private divide will have an advantage.

Flexibility and Cost Savings Remain Critical in Economic Recovery

Naturally, many IT projects were put on hold or reprioritized due to the global COVID-19 pandemic. While storage needs continued to grow (potentially at an even greater pace with the surge in remote work, learning, and so on), purchases and upgrades were understandably delayed as companies tried to preserve cash and lower their expense profiles. The flexibility and cost savings inherent in technologies such as software-defined storage rose to the top as organizations worked to support this unplanned infrastructure shift.

A software-defined approach enables IT to consolidate all of their existing hardware into a large storage pool, where capacity utilization can be optimized across different storage systems. Modern SDS platforms enable intelligent auto-tiering to dynamically move data to higher-performance or lower-cost storage as needed. Thin provisioning allows more efficient utilization of storage, making the best use of existing capacity.
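
A rough way to see why thin provisioning stretches existing capacity (simplified, with made-up numbers): capacity is backed only as data is actually written, not when volumes are provisioned.

    # Simplified thin-provisioning arithmetic with made-up numbers.
    physical_pool_tb = 100  # real capacity consolidated from existing arrays

    # Volumes as (provisioned TB, fraction actually written)
    volumes = [(20, 0.30), (50, 0.25), (40, 0.40), (30, 0.20)]

    provisioned = sum(size for size, _ in volumes)           # 140 TB promised to hosts
    consumed = sum(size * used for size, used in volumes)    # 40.5 TB actually backed

    print(f"Provisioned: {provisioned} TB, consumed: {consumed:.1f} TB")
    print(f"Pool headroom: {physical_pool_tb - consumed:.1f} TB")
    # Thick provisioning would already have exhausted the 100 TB pool;
    # thin provisioning leaves roughly 60 TB free until data is actually written.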

The best SDS technologies help improve performance and reduce latency, extending the life of existing systems while meeting business needs. The ability to combine dissimilar systems to create a highly available system can also result in significant savings.

The ability for IT to manage storage through one integrated system, with simplified provisioning and the elimination of manual tasks like data migrations, results in very significant time savings, which reduces the need to increase staffing and allows existing teams to focus on higher-value tasks.

SDS can usually be acquired with term licenses, lowering the entry point for adoption and allowing IT to adapt as circumstances change. As these benefits are increasingly realized, technologies that support flexibility and cost savings will continue to play a key role in both the short and long term as the industry stabilizes.

Moving from Hardware-Confined to Software-Defined

Many hardware companies are becoming software companies as the industry realizes that the true value is clearly in the software, and organizations increasingly move away from solutions that create hardware dependencies.

As this becomes clearer in the market, organizations are taking a deeper look at how their IT infrastructure solutions are deployed and operated; for example, whether they are limited in the data services they provide or restricted in their support for diverse storage equipment.

For software-defined storage products specifically, there are different levels of maturity. Where they land on that spectrum directly impacts how they enable IT to consolidate capacity across varied devices under one command center and provide the freedom to scale the data center with a choice of storage based on cost, performance, compliance, or any other requirement.

The three types are:

1. Storage OEMs manufacturing SAN or NAS systems that claim to offer SDS packaged with their hardware. In this case, data services are tied to specific hardware or a family of equipment from the same supplier.

2. Hyperconverged platforms that offer some level of SDS functionality, extending the reach of data services and other SDS functionality within the HCI cluster, but limited by the hypervisor.

3. Purpose-built SDS solutions that are vendor-neutral and span the entire storage infrastructure regardless of the make or model of storage, or the deployment model.

True SDS solutions allow organizations to extend the scalability, agility, and reliability of the storage infrastructure for unmatched coverage. From Fibre Channel to NVMe, and from classic storage designs to hyperconvergence, demand for the flexibility to adapt and modernize the data center without being locked into a particular hardware vendor or technology will only increase. True SDS lets organizations adopt new technologies alongside existing equipment to maximize their collective value, without delay or disruption to business services.

The Kubernetes Market will Have Two Flavors

As the use of Kubernetes continues to explode, the market is increasingly dividing itself between IT Operations (ITOps) and DevOps.

Traditional ITOps teams will require support for Kubernetes workloads for new applications on existing enterprise storage technologies. These teams will use a Container Storage Interface (CSI) driver as the interface to Kubernetes. Enabling container teams with a CSI driver is quickly becoming an industry standard; every storage system and every IT department needs to have one.
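
To make that concrete (a minimal sketch; the provisioner name csi.example-vendor.com is a placeholder, not a real driver), a vendor's CSI driver is exposed to Kubernetes through a StorageClass, and application teams then request capacity against it with a PersistentVolumeClaim. The manifests are shown here as Python dictionaries mirroring the YAML:

    # Minimal sketch of how a CSI driver is consumed in Kubernetes.
    # "csi.example-vendor.com" is a placeholder provisioner name, not a real driver.
    storage_class = {
        "apiVersion": "storage.k8s.io/v1",
        "kind": "StorageClass",
        "metadata": {"name": "enterprise-block"},
        "provisioner": "csi.example-vendor.com",     # the vendor's CSI driver
        "parameters": {"tier": "high-performance"},  # driver-specific options
    }

    pvc = {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": "app-data"},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "storageClassName": "enterprise-block",
            "resources": {"requests": {"storage": "100Gi"}},
        },
    }
    # Applying these (for example with kubectl or the Kubernetes Python client)
    # lets a traditional ITOps team serve container workloads from existing arrays.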

The other camp is DevOps teams: cloud-native, services-oriented, container-centric teams who want a container-native persistent storage solution. As container applications move from experimental projects to production, this segment is maturing, and its storage needs are starting to mirror those of ITOps teams in terms of security, availability, and performance, including backups, continuous data protection (CDP), and disaster recovery (DR). There is also a need for better configuration, monitoring, testing, and operations management tools.

Leading vendors in the open-source space that are backed by the CNCF and offer enterprise capabilities, management tools, and secure persistent storage will become more attractive to DevOps teams who prefer technology that can deliver on simplicity of deployment and broader hardware compatibility through abstraction.

##

About the Author

Gerardo A. Dada, CMO at DataCore Software


Dada is an experienced technology marketer who has been at the center of the web, social, mobile, and cloud revolutions at some of the world's leading companies. Prior to DataCore, he served as vice president of product marketing and strategy at SolarWinds. Earlier, Dada was head of product and solutions marketing at Rackspace, where he established the company as the leader in hybrid cloud. He has also held senior marketing roles at Bazaarvoice, Motorola, and Microsoft. Dada received a five-year business degree from UAEM University in Mexico and a general management certificate from the University of Texas at Austin.
Published Thursday, January 07, 2021 10:36 AM by David Marshall