Hydrolix 2025 Predictions: Data and Observability in 2025 - ROI Demands Will Cause Shift Toward Federated, Specialized Data Storage


Industry executives and experts share their predictions for 2025.  Read them in this 17th annual VMblog.com series exclusive.

By David Sztykman, Head of Product, Hydrolix

The data explosion has been well publicized for years now. By some accounts, we'll hit an astronomical 181 zettabytes by year end, an increase of over 150% from 2023 alone, and the growth curve is projected to keep steepening for the foreseeable future. Although I have my own cynical suspicion that too many organizations aren't sufficiently acknowledging, or planning ahead for, how this will affect them, I believe the explosion of data is already beginning to manifest as strategic pressure and deep structural change in the data management industry.

In 2025, enterprises will re-examine how they value and justify their data investments and consider new strategies for monetizing their data. That reassessment will reshape back-end storage architecture toward more nimble, federated solutions and a laser focus on efficient, specialized data storage.

More Scrutiny on Observability Spend

With such surges in data volumes, cost and value are becoming core concerns for CIOs and CFOs. In 2025, observability spending will be under the microscope, particularly because of the expense of managing log data, which has increased in volume by 500% over the last three years. Driven by high storage and analytics expenses, CIOs and CFOs will be asking data teams what value the company derives from storing all that log data. What's the ROI?

With the high cost of storing large volumes of data, it's all too easy to fall into the trap of assuming that only the most recent data matters for use cases like incident response and root cause analysis. Too often, enterprises try to cut costs by discarding data after a short retention window or quickly moving it to cold storage, where it turns into dark data. Losing access to this data reduces its ROI and leaves teams without what they need for historical analysis, brand protection, data science use cases, and more.

Instead, enterprises should be looking for ways to monetize the data the company generates, particularly by quickly turning analyses into actionable improvements to the customer experience. Doing so requires infrastructure that supports rapid-access, high-frequency, high-volume queries and has the flexibility to house varied data types in-house rather than with third parties. It must also be cost-effective, so that enterprises can keep their data available for long-term access and maximum ROI. All of these capabilities are possible with federated, specialized architectures, which leads me to my next prediction.

The End of One-Size-Fits-All Data Platforms

A direct result of the cost-versus-value scrutiny is the inevitable demise of the one-size-fits-all data platform as companies seek to maximize efficiency for specific data types and storage functions. Federated storage, where data platforms are decoupled from the front end, will replace traditional, all-in-one systems. In this model, storage can evolve based on data needs and performance requirements without being shackled to a single, monolithic backend. The front-end experience remains unified, allowing analysts and data scientists to operate seamlessly across varied back-end systems tailored to each data type's demands. Log data, for instance, may work best in append-only, ETL-friendly stores, while CRM data might thrive in systems like Snowflake. 
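To make this concrete, here is a minimal sketch in Python of what such a decoupling can look like. The class and dataset names are hypothetical illustrations, not any vendor's actual API: a thin routing layer exposes a single query interface while dispatching each request to whichever specialized backend owns that dataset.

```python
from abc import ABC, abstractmethod


class Backend(ABC):
    """A specialized storage engine sitting behind the unified front end."""

    @abstractmethod
    def query(self, expression: str) -> list[dict]:
        ...


class AppendOnlyLogStore(Backend):
    """Stand-in for an append-only, ETL-friendly log store."""

    def query(self, expression: str) -> list[dict]:
        # A real system would push the query down to the log engine.
        return [{"source": "log-store", "query": expression}]


class WarehouseStore(Backend):
    """Stand-in for a warehouse-style backend (e.g., for CRM data)."""

    def query(self, expression: str) -> list[dict]:
        return [{"source": "warehouse", "query": expression}]


class FederatedCatalog:
    """Unified front end: one query interface, many specialized backends."""

    def __init__(self) -> None:
        self._routes: dict[str, Backend] = {}

    def register(self, dataset: str, backend: Backend) -> None:
        self._routes[dataset] = backend

    def query(self, dataset: str, expression: str) -> list[dict]:
        # Analysts see one API; routing to the right engine stays hidden.
        backend = self._routes.get(dataset)
        if backend is None:
            raise KeyError(f"no backend registered for dataset {dataset!r}")
        return backend.query(expression)


catalog = FederatedCatalog()
catalog.register("cdn_logs", AppendOnlyLogStore())
catalog.register("crm_accounts", WarehouseStore())

# Same front-end call, different specialized engines underneath.
print(catalog.query("cdn_logs", "status:5xx last 90 days"))
print(catalog.query("crm_accounts", "accounts renewing this quarter"))
```

The point is the shape of the seam: analysts call one interface, and each backend can be swapped or tuned independently as data needs and performance requirements evolve.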

This decoupling isn't just more efficient; it lets companies bring in best-of-breed data storage solutions without sacrificing ease of use. It's a powerful shift, particularly as analysts and data scientists increasingly drive these decisions, putting pressure on companies to cater to their tool and storage preferences.

This harks back to a well-known paradigm shift in the data industry: Amazon realized that relational SQL databases weren't meeting its need to keep shopping cart data available to customers during peak shopping season. To serve that specialized use case, Amazon built Dynamo, one of the first NoSQL systems and the basis for what later became DynamoDB, kicking off a significant evolution beyond the relational database. The point I'm making here is that we've long known certain types of queries and certain analytical use cases are more efficiently served from specialized data stores.

So the days of simply bolting storage onto a monolithic Elasticsearch-style architecture are numbered. The new frontier is modular, federated, and performance-oriented. This shift is about more than data storage; it's a strategic approach to cost and performance in which choosing the right platform for the right data becomes essential.

What This Means for Decision-Makers

For those responsible for data strategies, the way forward involves adopting federated architectures and specialized storage that is both highly performant and cost-effective. Companies that prioritize these models will not only manage costs more effectively but will be positioned to capitalize on their data assets. In 2025, the ability to justify data costs and deliver high-performance solutions will separate the leaders from the laggards.

##

ABOUT THE AUTHOR

David Sztykman 

David Sztykman is Head of Product at Hydrolix, where he leads development of the core product and builds partnerships. Prior to Hydrolix, he worked in solutions architecture at Elastic and Akamai. He lives in Paris.

Published Friday, November 29, 2024 7:32 AM by David Marshall