By Brent Schroeder, Global CTO, SUSE
Enterprise adoption of edge computing
is still in its infancy. Even with 15 billion edge devices deployed in the field (IBM 2022) and strong appetite across sectors, today's edge
landscape is small compared to the future scale of the market. We are on the
cusp of tremendous growth as hardware improvements, ever more accessible
connectivity, and the influence of cloud-native approaches promise to unlock innovation.
The latest research suggests that worldwide spending on edge will grow from last
year's $208 billion to reach $317 billion by 2026.
There is good reason for this:
innovators in every industry are discovering new use cases and building a
competitive advantage with next-generation edge computing. Forward-looking
transformation strategies are moving compute closer to the point of interaction, where real-time
data analysis drives new revenue and efficiency opportunities.
This will have profound implications for the success and competitiveness of
enterprises. Recent Accenture research found that 83% of C-suite executives saw
edge computing as essential to remaining competitive. And time is running out:
81% believed that a failure to act quickly would lock them out of the full
benefits of edge. (Accenture:
Leading with edge computing, 2023)
Defining a Complex Landscape
While such recognition of edge's
growing significance to enterprises is encouraging, the term itself remains nebulous and
poorly understood. At SUSE, we find
it helpful to break edge down into three distinct but overlapping areas: near,
far, and tiny.
The near edge is largely the realm of
telecommunications and sits closest to the data center, with communications providers
bringing computing resources close to their own infrastructure. Beyond that lies the far edge, where deployments may stretch
into the tens of thousands of locations and use cases are at their most diverse across commercial
and industrial sectors. Examples include satellites, cell towers, and retail
points of service (POS). Finally, the tiny edge is closely tied to the
Industrial Internet of Things, where small, fixed-function, lightly
resourced devices such as security cameras, environmental
sensors, and factory floor controllers are deployed at scale.
Unlocking Opportunities with a Cloud-Native Approach
Modern edge deployments share some
fundamental building blocks: geographic distribution, decentralized
architecture, often a need for low-latency processing, and scale. Customers typically
have thousands of locations under management and can scale into the tens of thousands.
These characteristics present challenges not found in traditional on-premises and
cloud models: enterprises grapple with managing complex
applications across a myriad of devices, working with
resource-constrained hardware, and maintaining high levels of observability
and control. Yet these characteristics also underpin the opportunities offered
by edge when applications are much closer to the point of interaction and able
to unlock instant insight through real-time data analysis.
For innovators looking to realize the
competitive advantage offered by edge, the key to success lies in intelligently
deploying, scaling, and securing infrastructure across thousands of sites. To
do this, we need to leverage technologies and processes originally born in the
cloud to tackle the challenges presented by this complex ecosystem. A modern
cloud-native stack can deliver reliable, scalable edge solutions and full
lifecycle management built on three pillars: containers, Kubernetes to
orchestrate them, and automation running across the ecosystem.
At the edge, innovators
typically need to optimize both the agility and footprint of applications. Businesses
want to deliver new value fast and often. In
contrast to monolithic applications, containers pair exceptionally well with continuous
delivery models, enabling teams to deliver new capabilities quickly and
add value incrementally. Additionally, the smaller
footprint of containers and the ability to break applications down into more
granular services better suit software delivery to small endpoints over the
low-bandwidth networks often found in remote edge environments.
Once applications are built, orchestrating and
managing the lifecycle of containerized applications across a large,
distributed environment is far beyond the capabilities of manual processes.
Kubernetes has definitively won the container orchestration war. Across edge
solution deployments, little is static: locations come and go, deployments may
grow, and updates to applications and infrastructure happen at a rapid pace. A
framework is needed to automate and orchestrate this dynamic ecosystem without
the need to continually rescript control planes in languages designed in an era
of static infrastructure and monolithic applications. Implementing Kubernetes
at the edge is typically facilitated by purpose-built, lightweight operating
system and Kubernetes distributions that address limited hardware resources,
remote locations, and offline operation or weak connectivity. The most widely
adopted of these is K3s, which Rancher built and donated to the CNCF. Lastly, amid the wider attack surface
inherent to edge computing, security solutions that cover the full container
lifecycle are crucial: from secure software supply chains, to detecting and
remediating known and new security issues swiftly and at scale, to zero-trust
solutions that protect against as-yet-unknown issues.
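To make the orchestration piece concrete, the sketch below shows how a single containerized workload might be declared and rolled out to a lightweight edge cluster such as K3s through the standard Kubernetes API. It is a minimal illustration rather than SUSE-specific tooling: the image name, labels, resource limits, and node-selector label are hypothetical placeholders, and it uses the official Kubernetes Python client.

```python
# Minimal sketch (illustrative only): declaring a small containerized service
# and creating it on an edge cluster such as K3s via the standard Kubernetes
# API, using the official Kubernetes Python client. Image, labels, resource
# limits, and the node-selector label are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # read kubeconfig, e.g. exported from a K3s node
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="edge-telemetry"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "edge-telemetry"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "edge-telemetry"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="telemetry",
                        image="registry.example.com/edge/telemetry:1.4.2",
                        # Small requests and limits reflect the constrained
                        # hardware typical of far- and tiny-edge sites.
                        resources=client.V1ResourceRequirements(
                            requests={"cpu": "100m", "memory": "64Mi"},
                            limits={"cpu": "250m", "memory": "128Mi"},
                        ),
                    )
                ],
                # Hypothetical label that pins the workload to edge nodes.
                node_selector={"node-role.example.com/edge": "true"},
            ),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=deployment)
```

In practice, no one scripts clusters one at a time like this; a fleet-management or GitOps layer applies the same declarative definition across thousands of sites. The point is that the unit of delivery stays identical whether you manage ten locations or ten thousand.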
Gaining Competitive Advantage at the Edge
With the right stack and tools in place, innovators are
then able to unlock the potential of edge computing and revolutionize
operations. The tiny edge is receiving a lot of attention amid the transition
to Industry 4.0 and serves as a useful illustration.
Compare the everyday experience of a manufacturer that
has embraced edge with one that remains tied to legacy infrastructure. One
brings the devices and sensors at each location into a cloud-native ecosystem,
gaining real-time, granular visibility and the ability to locally process
millions of data points in seconds. It sees reduced costs, improved efficiency,
less downtime, better resource allocation, and lower energy consumption. The
other does not understand its environment as
thoroughly, lacks comprehensive real-time data collection, and risks losing
share in an industry where resilience and cost control are critical.
As another example, telecommunications providers are
leveraging edge to deliver open and flexible communications infrastructure that
is future-proofed and highly scalable. SUSE's Adaptive Telco Infrastructure
Platform (ATIP) is optimized for this category, enabling rapid rollouts (and
updates) of 5G networks and simplification of operations at scale. This is
crucial as we look to how next generation networks will support the new use
cases expected to arise from 5G and wider adoption of multi-access edge
computing.
From a global perspective, edge computing may be in its
early stages, yet it is already having a profound impact across various sectors.
Adoption is skyrocketing, and it is no surprise that Gartner predicts that by
2025, 75% of enterprise-generated data will be created outside of the
traditional data center or cloud (CNCF 2022).
It is exciting to see early innovators already
leveraging edge technologies to unlock new use cases and business models. Are
they looking too far into the future? We don't think so. We believe edge is crucial, and those that are
planning for it are gaining a competitive advantage over those that fail to
evolve.
++
Join us at KubeCon + CloudNativeCon North America this
November 6-9 in Chicago for more on Kubernetes and the cloud native
ecosystem.
##
ABOUT THE AUTHOR
Brent Schroeder, Global CTO, SUSE
As SUSE's Global CTO, Mr. Schroeder is responsible for
shaping SUSE's technology and portfolio strategy in support of emerging use
cases in areas such as hybrid cloud, IoT, and AI/ML. He drives the technology
relationship with numerous industry partners, participates in open source
communities, and evangelizes the SUSE vision with customers, press, and
analysts.
Mr. Schroeder brings to SUSE 30 years of technology
innovation and development experience in the IT industry. Prior to joining
SUSE, he worked in Dell's Office of the CTO, where he was responsible for software technology
strategy covering hybrid cloud, systems management, virtualization, and
operating systems. Mr. Schroeder has also held various management and
engineering roles with NCR, Compaq and HP.
Mr. Schroeder holds Bachelor's degrees in Computer
Science and Business Administration from Iowa State University.