Industry executives and experts share their predictions for 2019. Read them in this 11th annual VMblog.com series exclusive.
Contributed by Madhukar Kumar, vice president technical and product marketing, Redis Labs
Top 10 Data Predictions for 2019
The tech industry regularly sees the rise and fall of hype cycles, from the advent of the dot-com era to cloud computing, big data, and more recently artificial intelligence (AI) and blockchain. Looking back, it is clear that each of these major changes was additive or in some way related to the disruption that came before it. For example, AI would not be where it is today without big data. Big data would not have been possible without the advent of cloud computing, and the cloud itself would be non-existent without the World Wide Web boom of the '90s.
Armed with this hindsight, I believe we are about to make technology's next major leap due to a number of forces coming together. Similar to past trends, all of these are additive and built upon disruptions that have already happened or are currently in play.
In a nutshell, I believe we are headed into a zero latency future. Before you raise an eyebrow, let's first define what that means and then look at the individual trends I believe will together make this phenomenon a reality. If a machine (hardware plus software) can respond to humans or other machines in less than a second, it is a zero latency device or app.
When you talk to Alexa or Google Home today, the device often responds in less than a second, but I think it could be even faster. Think about autonomous vehicles, facial recognition, smart homes, etc., where everything needs to come together in order to make a decision and act on hundreds of inputs within a few milliseconds. Now imagine that kind of computing everywhere around you. That, in my mind, is a zero latency future, and in it, any response time over one second will be unacceptable.
So what are the trends shaping this future? Let's first look at the emerging megatrends.
1) Quantum computing - Earlier this year, Intel announced a major milestone in building quantum computing systems: a 49-qubit test chip, enough qubits to approach the practical limits of simulation on classical computers. IBM and Google made similar announcements. Although we probably won't see quantum machines replace classical computers in the next year, IBM has already opened up a public playground where people can start experimenting with the technology. These developments will fast-track opportunities for exponentially faster compute processing power.
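To get a feel for what a qubit adds over a classical bit, here is a toy single-qubit simulator in plain Python. It illustrates superposition only; it is not how Intel's or IBM's hardware (or IBM's hosted playground) actually works.

```python
import math

# A single-qubit state is a pair of complex amplitudes (alpha, beta)
# for |0> and |1>, with |alpha|^2 + |beta|^2 == 1.
def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into equal superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    """Measurement probabilities for reading |0> or |1>."""
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

qubit = (1 + 0j, 0 + 0j)        # start in the classical state |0>
qubit = hadamard(qubit)         # now a superposition of |0> and |1>
p0, p1 = probabilities(qubit)   # each outcome is equally likely: 0.5 and 0.5
```

A classical bit is always exactly 0 or 1; the superposed qubit above carries both amplitudes at once, which is the property quantum hardware scales up to get its exponential edge.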
2) 5G internet connections - Some providers, like Verizon, have already deployed 5G in a few cities in the United States. 5G builds upon lessons learned from 4G and delivers upload and download speeds on the order of 1 Gbps. That's right, no more spinning wheels when you're watching cat videos on your phone on the way to work.
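The speed-up is easy to quantify with back-of-the-envelope arithmetic. The rates below are illustrative assumptions, not measured figures:

```python
# Rough comparison of download times at assumed 4G vs 5G rates
# (~20 Mbps typical 4G, ~1 Gbps 5G; decimal units throughout).
def download_seconds(size_gigabytes, rate_megabits_per_s):
    megabits = size_gigabytes * 8 * 1000  # GB -> megabits
    return megabits / rate_megabits_per_s

t_4g = download_seconds(1.5, 20)     # a 1.5 GB video: 600 s, ten minutes
t_5g = download_seconds(1.5, 1000)   # the same video: 12 s
```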
3) Persistent memory - Intel recently announced the launch of Optane DC persistent memory, which looks like standard RAM but can store terabytes and even persist data when the power is switched off. I am hopeful this technology will continue to improve and eventually replace hard drives for the majority of use cases. With this increased capacity, vast amounts of data can be processed in real time and persisted without ever touching a disk.
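The programming model this points to can be roughly approximated today with a memory-mapped file: reads and writes look like byte-array access, yet the data survives the process. This is a stdlib sketch of the idea, not the actual Optane DC interface:

```python
import mmap
import os

path = "store.bin"  # illustrative file name standing in for persistent memory
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # reserve a 4 KB region

with open(path, "r+b") as f:
    mem = mmap.mmap(f.fileno(), 4096)
    mem[0:5] = b"hello"   # write with memory semantics, no explicit I/O call
    mem.flush()           # push the bytes to stable storage
    mem.close()

# A later process (after a "power cycle") sees the same bytes:
with open(path, "rb") as f:
    restored = f.read(5)
os.remove(path)
```

Real persistent memory is byte-addressable hardware on the memory bus, so it skips even the page-cache layer this sketch relies on.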
4) Real-time data processing at the edge - Due to the trinity of forces above, much more data processing will happen in real time at the edge (e.g., in devices for autonomous cars, smart cities, facial recognition, wearable tech and more). This phenomenon, often categorized as edge or fog computing, will become more real as processing gets faster, data becomes available in memory all the time, and network speeds increase exponentially.
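As an illustration of the pattern, here is a hypothetical edge node that keeps only a bounded window of sensor readings in memory and decides locally, with no round trip to a data center. The class name and threshold are invented for the example:

```python
from collections import deque

class EdgeMonitor:
    """Toy edge-side monitor: bounded memory, immediate local decisions."""

    def __init__(self, window=5, threshold=80.0):
        self.readings = deque(maxlen=window)  # old readings fall off automatically
        self.threshold = threshold

    def ingest(self, value):
        """Return True when the rolling average demands an immediate local action."""
        self.readings.append(value)
        avg = sum(self.readings) / len(self.readings)
        return avg > self.threshold

monitor = EdgeMonitor(window=3, threshold=80.0)
alerts = [monitor.ingest(v) for v in (70, 75, 90, 95, 99)]
# rolling averages: 70, 72.5, 78.3, 86.7, 94.7
# -> [False, False, False, True, True]
```

The device ships only the alerts upstream, not the raw stream, which is what keeps the reaction inside the zero latency budget.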
5) Data processing within compute - In traditional big data implementations, we saw programming logic move to the data (think MapReduce and Hadoop). Now I expect we'll begin to see the reverse: data, and more importantly data types, will be pulled into compute for near-zero latency processing, because any latency from seeking data on a disk will no longer be acceptable.
In my mind, the five trends above will put us squarely in the middle of a zero latency future. That said, there are additional trends to watch that will either spur this future or have a major impact on the way we interact with computers:
6) Serverless architectures - Serverless processing of large data sets will move more workloads away from big data frameworks to functions orchestrated at scale with Kubernetes-like tools. This means more organizations will be able to process big data by utilizing Function-as-a-Service (FaaS) solutions for better speed and affordability.
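A FaaS workload boils down to small, stateless handlers the platform can fan out at will. This minimal sketch shows the shape such a function takes; the event format here is an assumption for illustration, not any specific provider's contract:

```python
import json

def handler(event, context=None):
    """Count words in one document chunk — a tiny, stateless, map-style unit of work.

    Because the function holds no server state, the orchestrator can run
    thousands of copies in parallel and scale them to zero afterward.
    """
    text = event.get("body", "")
    counts = {}
    for word in text.split():
        counts[word] = counts.get(word, 0) + 1
    return {"statusCode": 200, "body": json.dumps(counts)}

# The platform would invoke this once per chunk of a large data set:
response = handler({"body": "big data big speed"})
```

The per-invocation billing and automatic scale-out are what make this cheaper and faster than keeping a big data cluster warm for bursty jobs.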
7) Multi-cloud - Multi-cloud adoption will
make data storage agnostic to cloud platforms and providers. Your data could be
stored partially on AWS and partially on Google Cloud or even on edge
computers, for example. More and more organizations will use technologies like
Kubernetes to break away from single provider lock-in.
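One way to picture provider-agnostic storage is a thin routing layer over interchangeable backends. This toy sketch uses in-memory stand-ins for the clouds; all class names are invented for the example:

```python
class MemoryBackend:
    """Stand-in for one cloud provider's storage (e.g. AWS or Google Cloud)."""

    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data[key]

class MultiCloudStore:
    """Route keys across backends by hash — the caller never sees a provider."""

    def __init__(self, backends):
        self.backends = backends

    def _pick(self, key):
        return self.backends[hash(key) % len(self.backends)]

    def put(self, key, value):
        self._pick(key).put(key, value)

    def get(self, key):
        return self._pick(key).get(key)

store = MultiCloudStore([MemoryBackend(), MemoryBackend()])  # two "clouds"
store.put("user:1", "alice")
restored_value = store.get("user:1")  # routed to the same backend it was written to
```

Because the application codes against one interface, swapping a backend (or adding an edge node) never touches the calling code, which is exactly the lock-in escape hatch.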
8) Eliminating bias in AI/ML - We will also see companies with massive amounts of consumer data (Google, Facebook, etc.) try to sift bias out of their data sets to make their AI and machine learning (ML) models more accurate and bias-free. Today, for example, one can argue that a lot of bias exists in how personal loans have been granted over the last 50 years. If ML algorithms are allowed to learn from that data, you can be almost certain those biases will persist, and this is one challenge all AI and ML providers will have to overcome.
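Before training on historical labels, a first sanity check is simply comparing outcome rates across groups. Here is a toy example on fabricated loan records; the data, group names and 0.2 threshold are all illustrative, not an industry standard:

```python
# Fabricated historical loan decisions, grouped by a protected attribute.
loans = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

def approval_rates(records):
    """Approval rate per group."""
    totals, approved = {}, {}
    for r in records:
        g = r["group"]
        totals[g] = totals.get(g, 0) + 1
        approved[g] = approved.get(g, 0) + (1 if r["approved"] else 0)
    return {g: approved[g] / totals[g] for g in totals}

rates = approval_rates(loans)                     # A: 2/3, B: 1/3
skew = max(rates.values()) - min(rates.values())  # 1/3 gap between groups
biased = skew > 0.2                               # flag the data before training on it
```

A model trained naively on the records above would simply reproduce the skew, which is why the audit has to happen before learning, not after.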
9) Data privacy - Given today's large volumes of data collection and instant processing requirements, data privacy will continue to dominate many data storage and processing decisions. This year we saw the introduction of the General Data Protection Regulation (GDPR) in the EU, which had far-reaching consequences for how companies collect and use private data. We will see more traction here, both in how data is collected and processed and in additional government rules and regulations.
10) Event-driven architectures - Microservices architectures will further evolve. For instance, as specific services increasingly need to work together with monolithic applications, the Mesh App and Service Architecture (MASA) is gaining in popularity. This approach uses data services to listen for events and react to them in real time.
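The listen-and-react pattern can be sketched as a minimal in-process event bus. A real MASA-style deployment would put a broker (Kafka, Redis Streams, etc.) between publishers and subscribers, but the shape is the same:

```python
from collections import defaultdict

class EventBus:
    """Toy in-process event bus: services subscribe to event types and react as events arrive."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Every subscriber reacts; the publisher never calls them directly.
        for handler in self.handlers[event_type]:
            handler(payload)

bus = EventBus()
audit_log = []
# Two independent services react to the same event without knowing each other:
bus.subscribe("order.created", lambda p: audit_log.append(f"audit {p['id']}"))
bus.subscribe("order.created", lambda p: audit_log.append(f"ship {p['id']}"))
bus.publish("order.created", {"id": 42})
# audit_log -> ['audit 42', 'ship 42']
```

Because the publisher knows nothing about its consumers, a monolith and a new microservice can both subscribe to the same stream, which is how the two worlds interoperate.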
The single biggest takeaway from these 2019 predictions should be that we are headed toward a zero latency future. This is an exciting future because, just like electricity, real-time computing will finally become pervasively available and invisible at the same time. It will require businesses to rethink how they collect and process their data, all over again. Therein lie some of the biggest challenges and opportunities.
About the Author
Madhukar is a product strategist with a track record of successfully running product management and product marketing teams for the last 10+ years. He has held several leadership positions at small and large technology companies like Zuora, HP and Oracle, where he was responsible for building vision, go-to-market strategies and opening up new markets for hypergrowth. Most recently he ran product strategy and product marketing at Oracle for the Customer Experience (CX) product portfolio, including Marketing, CRM, Commerce and Service. For the last 10 years he has also written in technology journals and spoken at industry events across the globe on disruptive new technologies affecting businesses and the future of customer experience.