Industry executives and experts share their predictions for 2020. Read them in this 12th annual VMblog.com series exclusive.
By Jason Shepherd, VP of Ecosystem at ZEDEDA
Edge Computing Accelerates, Bringing Benefits to Enterprise Technology
As we look ahead to what 2020 will hold for
the enterprise IT space, it's clear that IoT, AI, and edge computing are going
to become even more important components of the technology stack. Businesses
are realizing the power and value of being able to gather data through a
growing number of IoT devices and then analyze and act on it by processing it
right there at the edge, avoiding the latency and bandwidth issues of having to
send it all back to the cloud. Gartner recently named the empowered edge as one of the Top 10 Strategic
Technology Trends for 2020, while Forrester is predicting that "2020 will be the most
interesting year yet for vendors and users in this exciting space."
With the explosive growth predicted for edge computing (soon, more enterprise data will be created and processed outside of data centers than within them), it's inevitable that there will be some growing pains. This is
why we'll see an increase in interoperability efforts this year, along with IT
and OT practitioners finding new ways to collaborate. As other technologies
like 5G and AI continue to mature as well, the interplay between them and edge
computing will be important to watch and may yield some unexpected results. In
tandem with all of these developments, we'll see the conversation around data
trust and ownership pick up steam as both the volume of data being generated
and processed at the edge and the applications it's used for proliferate.
Here's more detail on five predictions for
edge computing and IoT in 2020.
1. Interoperability efforts kick into high gear
Over the past several years, the
IoT market has consisted of a paralyzing landscape of platforms reinventing foundational
capabilities for data ingestion, security, and management, mixed together with
applications and domain expertise. 2019 was a turning point, with many
providers feeling the pain of trying to own all aspects of a solution and end
users realizing that it's important to take control of their data the moment
that it is created at the edge. In 2020, we'll see more effort placed on how
best-in-class pure-play offerings interoperate as part of a broader ecosystem that
also mitigates lock-in to any given component. This will be facilitated by
industry efforts like LF Edge that promote open source collaboration to
facilitate interoperability through open, vendor-neutral APIs in both the
infrastructure and application planes.
2. 5G does not replace the need for edge computing
5G is one of the most
talked-about topics in networking right now, but the reality is that the hype
is outpacing the near-term impact. It will undoubtedly be a transformational
technology, but it's not going to replace the need for on-prem edge computing
any time soon. The reason is that even though 5G promises lower latency and higher bandwidth, the majority of applications running at the edge will still need some degree of localized compute for rapid, autonomous decision-making and to reduce the amount of data being backhauled, because bandwidth
always comes with a cost. In general, connectivity will require a holistic
approach, with 5G being augmented with private networks (e.g., BLE, LoRa,
Private LTE) bridged through localized edge compute nodes.
3. Computer vision is the killer app for AI at the edge
The buzz around AI
will continue to accelerate in 2020, including more announcements of
purpose-built acceleration silicon and toolsets to simplify both training and
inferencing workloads from edge to cloud. Computer vision will be the killer
app for AI at the edge: cameras are one of the best sensors around for deriving
rich information from the physical world, but issues with network bandwidth and
privacy will require analytics to be performed close to the source so only
meaningful events are backhauled and Personally Identifiable Information (PII)
is appropriately obfuscated. In addition to today's common use cases in
surveillance, autonomous vehicles, and robotics, computer vision at the edge
will power new capabilities and features previously unimagined.
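The edge-side pattern described above — analyze frames locally, backhaul only meaningful events, and obfuscate PII before anything leaves the device — can be sketched in a few lines. This is a minimal illustration with a hypothetical `Detection` record standing in for the output of a local inference model; the names, threshold, and redaction rule are assumptions, not any specific product's API.

```python
# Sketch of an edge-side filter for computer vision events. A hypothetical
# local detector is assumed to yield Detection records; only high-confidence
# events are backhauled, and image crops of people (PII) are stripped so the
# raw pixels never leave the edge.

from dataclasses import dataclass, asdict
from typing import List, Optional

@dataclass
class Detection:
    label: str
    confidence: float
    bbox: tuple                     # (x, y, w, h) in pixels
    crop: Optional[bytes] = None    # raw pixels; PII when label == "person"

def filter_for_backhaul(detections: List[Detection],
                        threshold: float = 0.8) -> List[dict]:
    """Keep only meaningful events; redact image crops for person detections."""
    events = []
    for d in detections:
        if d.confidence < threshold:
            continue                    # not meaningful enough to send upstream
        event = asdict(d)
        if d.label == "person":
            event["crop"] = None        # obfuscate PII at the source
        events.append(event)
    return events
```

In a real deployment the decision of what counts as "meaningful" would be application-specific, but the shape is the same: inference and redaction happen next to the camera, and only compact event metadata crosses the network.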
4. Decrease in separation between IT and OT groups for
IoT management
The beginnings of
what we think of today as IoT can be found several decades in the past, with
internally-networked (originally, hardwired) devices that performed tasks like
controlling factory automation equipment. In that environment, management of
the devices and their workloads fell to the Operations Technology (OT) department.
Today, many early IoT proof-of-concept solutions are deployed via "shadow IT,"
with OT resources bypassing Information Technology (IT) colleagues using
cellular-connected cloud platforms in limited scale. However, as more
internet-connected devices generate data, concerns that more typically fall under IT (security, application and device management, and data analytics) have to be addressed. IoT deployments require a unique hybrid between
OT and IT responsibilities, and while the divide between OT and IT has been a
roadblock to widespread adoption, in 2020 we'll see decreased separation
between these groups as it becomes increasingly clear that collaboration is
necessary. Edge virtualization will help accelerate this change, because it
allows both legacy and cloud-native applications to run on shared
infrastructure, further bringing OT and IT into contact with each other.
5. Real conversation around data trust and
ownership
With the proliferation of data driven by investments
in IoT and AI, the next big conversation is around data trust and
ownership: after all, if you don't have confidence in your data, you can't
extract maximum value from it. In 2020, we'll see increasing collaboration on
how we achieve trust across heterogeneous systems and stakeholders, which is
key to enabling data sharing and monetization, trusted workload consolidation,
and scaling the ability to meet compliance requirements such as GDPR. Achieving
trust in data at scale is about more than blockchain: it requires a system-level
approach, starting with silicon-based root of trust at the edge, which is where
all data is created. An example collaboration is the emerging Linux Foundation
effort Project
Alvarium, which plans to address the trust challenge by combining
various trust insertion technologies to deliver data from devices to
applications with measurable confidence, all while helping data owners maintain
privacy on their terms. On a related topic, we'll start to see regulators exploring
the need to mitigate the data gravity being amassed by a few large players in
the market. Ultimately, digital transformation is about balancing value
realized with privacy and IP protection, and we'll need both technology and
regulation to achieve this.
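To make "trust insertion" concrete, here is a minimal sketch of tagging data with integrity evidence at the point of creation. A plain per-device secret and HMAC stand in for the silicon-based root of trust and whatever scheme a system like Project Alvarium would actually use; everything here is illustrative, not that project's design.

```python
# Minimal sketch of trust insertion at the edge: each reading is tagged with
# an HMAC computed from a per-device key, so downstream consumers can verify
# that the data was not altered in transit. In a real system the key would be
# protected by a silicon root of trust; a plain bytes secret stands in here.

import hashlib
import hmac
import json

DEVICE_KEY = b"example-device-secret"   # assumption: provisioned per device

def sign_reading(reading: dict, key: bytes = DEVICE_KEY) -> dict:
    """Wrap a sensor reading with an integrity tag at the point of creation."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "tag": tag}

def verify_reading(envelope: dict, key: bytes = DEVICE_KEY) -> bool:
    """Recompute the tag downstream; reject any reading that was tampered with."""
    payload = json.dumps(envelope["reading"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["tag"])
```

The point of the sketch is the system-level shape: evidence is attached where the data is created, and confidence in the data is something a consumer can check rather than assume.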
About the Author
Jason Shepherd is VP
of Ecosystem at edge virtualization company ZEDEDA.
Prior to joining ZEDEDA,
Jason was CTO for the Dell Technologies Edge and IoT Solutions Division. His track record as a thought leader in the market includes building up the award-winning Dell IoT Solutions Partner Program and
establishing the vendor-neutral, open source EdgeX Foundry project to
facilitate greater interoperability at the IoT edge. In addition to serving on
the board of the Linux Foundation's Edge umbrella project, Jason is active in
the Industrial Internet Consortium (IIC) and other key industry efforts focused
on IoT and edge computing. He was recognized as one of the Top 100 Industrial
IoT influencers of 2018. He holds 14 granted and 20 pending US patents.