Spectro Cloud 2024 Predictions: Kubernetes at the Edge


Industry executives and experts share their predictions for 2024.  Read them in this 16th annual VMblog.com series exclusive.

Kubernetes at the Edge

By Saad Malik, CTO and co-founder, Spectro Cloud

As we write this, we're just about to head off to KubeCon for our fourth Edge Day, the colocated event all about edge computing and edge Kubernetes.

We're also just publishing our 2023 State of Production Kubernetes report, in which we dig into current adoption and sentiment around using Kubernetes at the edge.

It feels like a good time to make some predictions for 2024 around edge computing. Let's dive in.

1. Businesses will wake up to edge's strategic value

Edge computing is far from new, and pioneers have been using Kubernetes at the edge in some form for several years. But it's certainly not common. We believe 2024 will change that.

Our new research data found that while only 7% of Kubernetes users have edge clusters in full production today, 42% are piloting or in the middle of deploying, and 44% are working on their edge plans.

Why? Because companies are increasingly seeing edge as strategic. 50% are investigating edge to deliver business process improvements such as cost savings, and 41% because edge is key to new connected solutions for customers. The value is clear, but it takes a mindset shift.

One interviewee told us:

"We're a transportation company, so the use cases for edge are obvious. Just getting things into the warehouse where you need up-to-date inventory information was a no-brainer for business value. But we weren't used to thinking about devices that could move around and do business logic. We needed a bit of time and experience to change that thinking."

In many cases, the recent wave of innovation in AI has prompted businesses to look again, with greater urgency, at moving workloads to the edge. Another interviewee told us: "Anything that is time-sensitive is a candidate for edge computing. Anything, for example, AI or BI really is an edge kind of case, with more time-sensitive data and real-time communication between devices."

At Spectro Cloud we're really excited about the potential of AI workloads at the edge, and have been for some time. Check out this blog or watch our on-demand webinar to hear why. Or for a little inspiration, check out this talk from the last KubeCon where agritech computer vision pioneers Tevel talk about their flying autonomous robots.

While in some cases new edge applications will be built using cloud-native patterns from the start, that won't always be true. Many customers tell us they need legacy VMs at the edge mixed with container workloads running on the same devices.

We're working with one major restaurant chain that came to us with VM support at the edge as a non-negotiable, and others will be in the same position, whether because the VM is supplied by a third-party vendor or is part of a legacy system that is too complex to refactor. This need has prompted us to build on the open source project KubeVirt, which runs VMs inside pods so they can be scheduled and managed just like containers.
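For readers who haven't seen KubeVirt in action, here's a minimal sketch of what a VM looks like once it's expressed as a Kubernetes resource. The name and disk image below are purely illustrative placeholders (a real legacy VM would typically boot from its own imported disk), but the shape of the manifest is standard KubeVirt:

```yaml
apiVersion: kubevirt.io/v1
kind: VirtualMachine
metadata:
  name: legacy-pos-vm              # illustrative name for a legacy point-of-sale VM
spec:
  running: true                    # start the VM as soon as the resource is created
  template:
    metadata:
      labels:
        kubevirt.io/vm: legacy-pos-vm
    spec:
      domain:
        devices:
          disks:
            - name: rootdisk
              disk:
                bus: virtio        # expose the disk to the guest as a virtio device
        resources:
          requests:
            memory: 2Gi            # resource requests work just as they do for pods
      volumes:
        - name: rootdisk
          containerDisk:
            image: quay.io/containerdisks/fedora:latest  # placeholder image; a real deployment would supply its own VM disk
```

Apply it with kubectl and the VM runs inside a pod, so the same scheduling, monitoring and lifecycle tooling you use for containers applies to it too.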

2. Edge costs take center stage

When you start experimenting with edge computing, in your labs or a few pilot locations, it's natural to focus on getting your application up and running and validating that it works and delivers value to the business.

But when you start doing edge at scale, you face a whole new challenge: cost.

Edge costs break down naturally into two parts.

There's the infrastructure cost itself: the hardware devices, cases, cabling, software and other elements that you deploy at the edge. Across thousands or tens of thousands of sites, this can really add up - particularly if you have to deploy multiple devices per site to achieve high availability (HA).

As one devops engineer we interviewed put it, "I was working at a company that had warehouses in a remote location that needed inventory management. It seemed like a good use case, edge Kubernetes would have solved the problem and would have unlocked other abilities like possibly some sort of machine learning onsite to process data. But it was quite a capital investment."

We know that capital costs can be prohibitive. This is one of the reasons we've developed a 2-node HA architecture, to cut those costs by 33% or more compared to conventional 3-node deployments.

But perhaps an even larger component of edge cost is in time. Time to get a suitably expert engineer to a remote location to install and activate an edge device. Time to go back to the site to troubleshoot every outage, apply every patch.

We're working with one manufacturer that produces connected equipment for dental clinics. It came to us suffering at least one day of downtime per month per clinic, which not only hurt customer satisfaction and cost the clinics revenue, but also meant the manufacturer was bleeding money to send field engineers to clinic sites across the country day after day. The backlog was growing because the team was small, and clinics were waiting weeks for a visit.

This is the real issue for organizations embracing edge, whether internally for operations or as part of a customer-facing product or experience: the cost, and the human-resource and skills bottleneck that goes along with it.

Already pioneers are acknowledging this. In our research we asked about the main challenges with doing edge. Naturally, security was the top challenge (and we'll get to that in a minute), but 35% of respondents listed "field engineering visits to deploy and maintain edge computing infrastructure are too costly" among their top challenges.

So in 2024 we'll see this trend play out. Edge adopters will increasingly demand solutions from edge vendors, whether that's zero-touch onboarding for new edge devices or safe zero-downtime remote upgrades. We'll see some edge projects stall out due to cost constraints, as the budget runs out or the ROI figures fail to add up. We'll see others ramp up their hiring and training for field engineering resources.

3. Edge security bites back

Conduct any survey of IT professionals and ask them what their top concerns or challenges are, and security will be number one, every time. Our research into edge Kubernetes this year was no different: it was the top challenge our respondents said they experienced in their edge projects.

It's hard to overstate the challenges that the edge presents for security. Here you have computing devices that may be open to physical attack, tampering or theft, or in the case of the tactical edge, even be deployed in enemy territory.

Edge devices may be disconnected, unmonitored by SIEM tools, and rarely patched. Yet increasingly they're shouldering the burden of mission-critical workloads, deployed with sensitive intellectual property like AI models, and carrying private data such as medical records or payment details.

One devops practitioner we interviewed in the education sector expressed their concerns: "Edge devices and platforms don't sit in our cloud or in our datacenter, and they are very private - there is a lot of proprietary logic on them, so we have to be more concerned about security, authentication, encryption, and the policies we implement. We are more concerned with edge applications and the Kubernetes clusters that control them."

The more organizations rush to deploy edge computing, the more the attack surface and risk grow. In 2024, we'll see security breaches that target edge deployments - and compliance violations as a result of poor security controls at the edge. We'll see smarter organizations prioritize security in their edge architectures and how they build out and manage the software stacks on their devices. Already, in our research the top factor cited for those choosing a Kubernetes management platform was "security and compliance" - ahead of both performance and cost.

If edge security is on your radar, here's a starting point: we recommend reading the white paper we published in partnership with Intel earlier this year, introducing the Secure Edge-Native Architecture (SENA). It describes controls from the silicon to the app that protect edge deployments against a range of exploits.

Get the data

So those are our three predictions. What do you think?

To find out more about the data points we used in this article, check out our new report or watch our KubeCon 2023 Edge Day keynote.

##

ABOUT THE AUTHOR

Saad Malik, CTO and co-founder, Spectro Cloud


Saad Malik is the CTO and Co-Founder at Spectro Cloud. His passion is leading teams that solve large-scale problems, using cutting-edge technologies in the areas of cloud, virtualization, containers, and distributed systems. In his twenty years of experience, Saad has successfully built and delivered transformational products for enterprises, service providers, and consumers. He is a big Sci-Fi fan and enjoys building LEGO spaceships with his kids.

Published Monday, November 06, 2023 7:37 AM by David Marshall