Vcinity 2022 Predictions: Changing the Data Gravity Paradigm in 2022

Industry executives and experts share their predictions for 2022.  Read them in this 14th annual VMblog.com series exclusive.

Changing the Data Gravity Paradigm in 2022

By Steve Wallo - CTO, Vcinity

The past two years have been challenging for many organizations, and predicting upcoming advancements in technology at this point can only be based on very recent user experiences; in my case, those I have been observing across our customer base. A major trend I would like to highlight is removing the need to co-locate compute and applications with their associated data in order to get good performance and a good user experience. The gravity problem associated with data and applications is changing for the better. Below are some of the things I see becoming possible in the near future.

  • Now that applications can move freely between clouds and edge/on-premises locations, we will finally see data no longer having to follow the apps as they move. That data movement has always been the shortcoming of these gains: dragging large data sets along consumes so much time and so many resources that it largely cancels out the advancement. I see this issue being resolved. With recently developed technology, the age-old requirement that the compute engine and its data be co-located no longer holds. Dynamic, agile application movement can now happen while the data stays in place, giving the apps full access and performance without WAN penalties (see the first sketch after this list). This finally enables the original vision of a true multi-cloud and hybrid cloud environment from both the application and data perspective, and it will allow vast improvements in business resilience, workload efficiency, and user experience.
  • With the ability for compute to point at remote data, we can expect federated compute: processing power across multiple clouds and multiple edges working together in real time on a single distributed workload. In addition to working off a common dataset, each processing node, regardless of location, can now share its own data products with the rest of the federation. Imagine a collective of processing instances, spun up as needed, pulling in data from hundreds or thousands of small devices across a smart city. This could empower AI workloads during a natural disaster or other major event, solving and tracking problems as they unfold, and deliver immediate insight for faster actions and decisions without waiting for perishable copies of data to move long distances. The second sketch after this list illustrates the basic pattern.
  • As 5G rapidly evolves, we can expect new microservices running at the near and deep edge, all with coherent access to each other's data. Want to run geo-distributed AI/ML workloads with full awareness across vast distances, leveraging all of the data simultaneously? We see that becoming more and more prevalent. Imagine any cloud accessing federated deep-edge compute and its associated data for true cloud-edge data fusion. The era of the powerful but "stranded" edge will be a thing of the past, and we see this driving more advanced edge technologies now that instant accessibility from any edge to any other edge or cloud is becoming possible.
  • Another change we see coming is in the role of data management. Data management is important and will only grow as data grows, but having to manage multiple copies of the same data scattered around the globe will subside. Maintaining, storing, and managing distributed copies is one of the bigger problems and costs: which copy is correct, is it still needed, and what about the storage to house it? There is also a security element; the more copies of the same information exist, the larger the attack surface for compromise. With the advent of common, geographically accessible data sets, the number of copies is minimized, reducing both the complexity of managing enterprise data and the security target area of that information.
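
To make the first bullet concrete, here is a minimal sketch of the access pattern it describes: an application that has been relocated to another cloud or edge site keeps pointing at a dataset that never moved. It uses the general-purpose Python fsspec library purely as a stand-in for remote access; the bucket path is hypothetical, and the WAN-optimization layer that would make this performant over distance (Vcinity's or otherwise) is outside the scope of the sketch.

```python
# Minimal sketch: the application reads a dataset that stays at its home
# site instead of being copied to wherever the compute happens to run.
# The bucket/key below is a hypothetical placeholder; any protocol that
# fsspec supports (s3://, gcs://, https://, etc.) follows the same pattern.
import fsspec

REMOTE_DATASET = "s3://primary-site-datalake/telemetry/2022/frames.parquet"

def inspect_remote(path: str) -> int:
    # Open the object where it lives; only the bytes actually read
    # traverse the WAN, not the whole dataset.
    with fsspec.open(path, mode="rb") as f:
        header = f.read(4096)      # e.g., peek at metadata or a schema
    return len(header)

if __name__ == "__main__":
    print(inspect_remote(REMOTE_DATASET))
```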
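
And a minimal, single-process simulation of the federation idea in the second bullet: several notional nodes, each imagined in a different cloud or edge site, read only their own slice of one shared dataset and contribute a partial result back to the group. Node names, the dataset URL, and the per-node "analysis" are hypothetical placeholders for whatever a real deployment would run.

```python
# Sketch of federated compute over a single shared dataset. Threads stand in
# for nodes in different clouds/edges; nothing here is Vcinity-specific.
from concurrent.futures import ThreadPoolExecutor
import fsspec

SHARED_DATASET = "s3://shared-city-feed/sensor-frames.bin"   # hypothetical
NODES = ["cloud-east", "cloud-west", "edge-downtown", "edge-stadium"]
CHUNK = 1 << 20                                              # 1 MiB slice per node

def node_task(node_id: int) -> int:
    # Each "node" reads only its assigned slice of the common dataset in place.
    with fsspec.open(SHARED_DATASET, mode="rb") as f:
        f.seek(node_id * CHUNK)
        data = f.read(CHUNK)
    return sum(data)             # stand-in for real per-node analysis

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=len(NODES)) as pool:
        partials = list(pool.map(node_task, range(len(NODES))))
    # The federated result is assembled from every site's contribution.
    print(dict(zip(NODES, partials)), "combined:", sum(partials))
```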

As you can see, the age-old paradigm of data needing to sit next to the compute that uses it is breaking down. The concept of data gravity has been altered, allowing faster, more immediate ways to extract information from data anywhere, at any time. We should all watch for the new and interesting use cases that come out of this, and for what users will be sharing with us next year.

##

ABOUT THE AUTHOR

Steve Wallo 

Steve Wallo currently serves as Vcinity's CTO, overseeing resources related to the insertion of advanced technologies and strategies into customer architectures and future IT decision methodologies. He is responsible for bridging future IT trends into the company's existing portfolio capabilities and future offerings. Prior to Vcinity, Wallo was CTO at Brocade Federal, responsible for articulating Brocade's innovations, strategies, and architectures in the rapidly evolving federal IT space for mission success. Wallo has also served the U.S. Government as chief architect for the NAVAIR Air Combat Test and Evaluation Facility High Performance Computing Center.

Published Friday, January 14, 2022 7:37 AM by David Marshall