Industry executives and experts share their predictions for 2021. Read them in this 13th annual VMblog.com series exclusive.
In 2021 Edge Will Skill Up and Down
By Dale Kim, Sr. Director of Product Marketing at Hazelcast
In every
sense of the word, 2020 has been a monumental year. For many of us, our
personal and career lives blended together as we moved our work home. For those
of us who work in enterprise technology, this sector experienced disruptions of
epic proportions. Digital transformation initiatives accelerated in some
instances by three to four years, and businesses grappled with cobbling together systems that IT teams
will spend the next few years perfecting.
As we
look back on 2020, what's certain is edge computing began to show its value in
the enterprise. In fact, as Forrester recently noted, edge computing is about to receive its diploma in practical business
use, driven largely by AI and 5G.
To expand
on Forrester's forecast, there are two key trends within edge computing I
believe will impact businesses:
- Edge Skill-Up:
Where the basic functionality provided in these devices will evolve to add
more intelligence.
- Edge Skill-Down:
Where data requiring heavy compute power that can't be processed on the
device is more efficiently sent to a centralized analytics site.
Let's
dissect what these entail, what's happening in the industry to influence these
trends, and how they will evolve in 2021.
Edge Skill-Up
In short,
edge skill-up centers on how devices will become more intelligent for data
processing at the edge. This is where enhancements in machine learning (ML) and
"edge-optimized" software provide extensive functionality to enable action at
the edge site immediately. The side benefit is a reduction in data transmission
back to a centralized data center. The key challenge of edge skill-up is the
limited physical space at the edge, so you are constrained to small-footprint computer
hardware, and thus to less compute power. This means businesses must consider
investing in technologies like stream processing engines, embeddable in-memory
data stores, etc., that are architected for edge computing.
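As a rough illustration of what edge-optimized processing can look like (the class, thresholds, and values below are invented for this sketch, not any particular product's API), a small sliding window held in bounded memory lets a device act on readings immediately and transmit only alerts upstream:

```python
from collections import deque

# Illustrative sketch of edge-side stream processing: keep a small sliding
# window in memory and emit an alert the moment the windowed average crosses
# a threshold, instead of shipping every raw reading to the data center.
class EdgeWindow:
    def __init__(self, size=5, threshold=80.0):
        self.window = deque(maxlen=size)   # bounded memory suits small edge hardware
        self.threshold = threshold

    def process(self, reading):
        """Ingest one sensor reading; return an alert dict or None."""
        self.window.append(reading)
        avg = sum(self.window) / len(self.window)
        if avg > self.threshold:
            return {"alert": "high", "window_avg": round(avg, 2)}
        return None  # nothing worth transmitting upstream

# Only the anomalous portion of the stream produces output to send onward.
edge = EdgeWindow(size=3, threshold=80.0)
alerts = [a for r in [70, 72, 75, 90, 95, 99] if (a := edge.process(r))]
```

The design point is the bounded `deque`: the device's memory use stays constant no matter how long the stream runs, which is the constraint small-footprint edge hardware imposes.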
An
example of how this is used is in device logins. Not long ago, you could only
lock a phone with a passcode, whereas newer mobile devices use facial
recognition to grant access. And the advanced login capabilities do not end
there. As the hardware and software in these devices are updated, ML algorithms
will better recognize accessories like glasses or, in today's world, face
masks, allowing for faster recognition of the face and access to the device. It's
anticipated all future edge devices - the enablers that transform the way
people and enterprises interact with data quickly - will operate this way.
This can
also be seen on a larger scale in the oil and gas industry. The operating costs
of a drilling rig are very high and any suboptimal operation, including
downtime, can have significant impacts on the bottom line. Rigs equipped with a
large number of sensors to detect small vibrations during the process can
gather, monitor and analyze real-time high-frequency data used to feed analytic
systems. Specialized algorithms make very fine-tuned adjustments to the
drilling process, such as the real-time adjustment of the RPM of the drilling
string and bit. Data is also delivered to the base data center for collective
analysis to enable preventive maintenance. What's more, with platforms like
a stream processing engine added to the tech stack, this processing moves even
faster - a pattern we've seen come alive with companies like SigmaStream.
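The rig pattern above can be sketched roughly as follows; the function, thresholds, and step size are illustrative stand-ins, not any vendor's actual algorithm. Vibration readings are analyzed at the rig, the drill-string RPM is nudged in real time, and only a compact summary is forwarded to the base data center:

```python
# Hedged sketch of edge-side control with centralized summarization
# (all names and numbers are made up for illustration).
def adjust_rpm(current_rpm, vibration_samples, target=0.5, step=10):
    """Lower RPM when mean vibration exceeds target; raise it when well below."""
    mean_vib = sum(vibration_samples) / len(vibration_samples)
    if mean_vib > target:
        new_rpm = current_rpm - step      # back off to reduce vibration
    elif mean_vib < target * 0.5:
        new_rpm = current_rpm + step      # headroom available: speed up
    else:
        new_rpm = current_rpm
    # The compact summary, not the raw samples, goes to the base data center.
    summary = {"mean_vibration": round(mean_vib, 3), "rpm": new_rpm}
    return new_rpm, summary

rpm, report = adjust_rpm(120, [0.7, 0.8, 0.9])
```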
Edge Skill-Down
Where
skill-up centers on the devices, edge skill-down focuses on data delivery, and
how devices will utilize 5G speeds to offload heavy processing needs. This
means the complex compute work that takes place will be more centralized in the
data center. This is much the opposite of the complex processing capabilities
made available in edge device skill-up. Edge devices today often filter and
aggregate data to reduce the amount of data that needs to be sent to the
central data center, but that processing can result in lost data resolution.
With faster network access, devices can do less preprocessing so a higher level
of data granularity can be delivered to achieve greater analytical accuracy.
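The trade-off described here can be made concrete with a small sketch (the readings and payload shapes are invented for illustration): aggregating at the edge shrinks the payload, but it discards the resolution that centralized analytics could use.

```python
import json

readings = [12.1, 12.3, 55.0, 12.2, 12.4]   # one spike hidden in the stream

# Low-bandwidth path: preprocess at the edge and ship a single summary value.
aggregated = {"avg": sum(readings) / len(readings)}

# High-bandwidth (5G) path: ship the raw stream so the spike survives analysis.
raw = {"samples": readings}

saved_bytes = len(json.dumps(raw)) - len(json.dumps(aggregated))
spike_survives = max(raw["samples"]) > 50   # only the raw path keeps the spike
```

The averaged payload is much smaller, but the 55.0 spike vanishes into a ~20.8 average; with 5G-class bandwidth, sending the raw samples preserves the anomaly for the central algorithms.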
Retail
provides a great example in fraud detection. A card terminal (edge device) can
process the transaction, but it can't recognize if that card is being
fraudulently used. Instead, that data must quickly be sent to the base data
center and analyzed by complex algorithms running over a large history of
prior transactions to raise a red flag. The emergence of 5G will cut down on
latency in transferring this data, but there are still latency challenges
around running the complex fraud algorithms, especially with high data and
transaction volumes. This is where adding a layer in the central data center
like an in-memory computing platform starts to make sense as it greatly reduces
overall latency while increasing system uptime, with or without 5G.
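A rough sketch of this pattern follows, with a plain in-memory dictionary standing in for a full in-memory computing platform (the velocity rule, names, and thresholds are invented for illustration): the terminal forwards the transaction, and the data center checks it against recent history held in RAM to keep latency low.

```python
import time

history = {}  # card_id -> list of (timestamp, amount); held in memory, not on disk

def check_transaction(card_id, amount, now, max_txns_per_minute=3):
    """Flag a card that transacts too frequently within a one-minute window."""
    recent = [t for t, _ in history.get(card_id, []) if now - t < 60]
    flagged = len(recent) >= max_txns_per_minute
    history.setdefault(card_id, []).append((now, amount))
    return flagged

# Five rapid-fire transactions on one card: the later ones trip the rule.
t0 = time.time()
results = [check_transaction("card-42", 9.99, t0 + i) for i in range(5)]
```

Because the history lookup is a RAM access rather than a disk or network round trip, the check stays fast even at high transaction volumes, which is the latency argument the paragraph above makes.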
Make no
mistake, edge computing is on the cusp of experiencing massive business
adoption in 2021. As data expands and progresses at an incredible rate,
enterprises that want to stay ahead need to incorporate infrastructure that is
fast, reliable and scalable. It's worth noting that in tandem with edge
computing, in-memory technologies are poised to make their mark next year. Any
company that seeks to prepare for edge skill-up and down would be remiss not to
consider this as part of their digital transformation strategy.
About the Author
Dale Kim is the Senior Director of Technical Solutions at Hazelcast and is responsible for product and go-to-market strategy for the in-memory computing platform. His background includes technical and management roles at IT companies in areas such as relational databases, search, content management, NoSQL, Hadoop/Spark, and big data analytics. Dale holds an MBA from Santa Clara and a BA in computer science from Berkeley.