Industry executives and experts share their predictions for 2021. Read them in this 13th annual VMblog.com series exclusive.
Increase in Cloud-Native Apps Drives Key Challenges and Opportunities
By Ankur Singla, Founder and CEO of Volterra
More organizations are evaluating or executing
a move to a cloud-native environment. There are many factors driving this
trend, but at a high level, three primary benefits stand out: quicker app
development, greater scalability, and lower operating cost.
Like any new trend in enterprise tech, the
move to cloud-native environments will come with a healthy amount of
disillusionment from companies that took the plunge, did not plan or execute
properly, and suffered major issues as a result. A few key trends will rise to
the forefront over the next 12 months as organizations get busy identifying,
addressing, and overcoming these critical challenges:
1. API sprawl jeopardizes the security of modern applications
As organizations continue to digitally
transform business processes, they are increasingly transitioning from legacy
applications to modern, cloud-native apps. These intricate modern apps feature
far more APIs than their predecessors. And since these apps are built with
extensive microservices, many of these APIs are deeply embedded and hidden.
This API sprawl has created many new attack vectors. Few vendors address app
security properly at the API level, leaving developer and security teams
scrambling to protect these apps. Traditional API gateways were designed for app-to-web communication, not the app-to-app communication that characterizes distributed, cloud-native environments. As a result, developer and security teams must manually discover all APIs and enforce policies on them, a cumbersome and error-prone process.
In 2021, the industry will popularize a new
approach for securing modern, cloud-native apps: the use of machine learning to
automatically identify all APIs, no matter how deeply embedded or hidden, and
then enforce policies on each one. This will eliminate the difficult task of
manually identifying and enforcing policies for each API.
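To make the prediction concrete, here is a minimal, illustrative Python sketch of the idea: learn the API surface from observed app-to-app traffic, then flag endpoints that have no policy. This is not Volterra's implementation or any vendor's product; the log format, endpoint names, and the simple path-normalization step standing in for the machine-learning model are all hypothetical.

```python
# Conceptual sketch of automatic API discovery from traffic logs.
# Not a real product's implementation; log lines and endpoint names are made up.
import re
from collections import Counter

# Hypothetical access-log lines captured from east-west (app-to-app) traffic.
LOG_LINES = [
    "GET /api/orders/1024 200",
    "GET /api/orders/2048 200",
    "POST /api/payments 201",
    "GET /internal/debug/heapdump 200",   # deeply embedded, undocumented API
    "POST /api/payments 201",
]

# Endpoints the security team already knows about and has policies for.
DOCUMENTED_ENDPOINTS = {"GET /api/orders/{id}", "POST /api/payments"}

def normalize(method: str, path: str) -> str:
    """Collapse numeric path segments so /api/orders/1024 and /api/orders/2048
    map to the same logical endpoint."""
    template = re.sub(r"/\d+", "/{id}", path)
    return f"{method} {template}"

def discover_endpoints(lines):
    """Learn the API surface from observed traffic (stand-in for the ML step)."""
    counts = Counter()
    for line in lines:
        method, path, _status = line.split()
        counts[normalize(method, path)] += 1
    return counts

if __name__ == "__main__":
    observed = discover_endpoints(LOG_LINES)
    for endpoint, hits in observed.items():
        known = endpoint in DOCUMENTED_ENDPOINTS
        # Shadow APIs (observed but undocumented) get flagged for a default-deny policy.
        action = "existing policy applies" if known else "FLAG: apply default-deny policy"
        print(f"{endpoint:35s} hits={hits:3d}  {action}")
```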
2. Growing understanding of service meshes accelerates cloud-native
transition
In 2021, organizations will become more
familiar with service mesh technology to help support successful cloud-native
adoption. A service mesh is an infrastructure layer used for managing, securing
and optimizing communication between microservices. It's critical that
organizations become proficient with the technology when transitioning to a
cloud-native approach, which typically leverages microservices-based app
architectures. With heavy use of microservices, cloud-native apps are much more
complex and harder to manage, connect and secure than legacy apps. Existing
point products, such as load balancers and web app firewalls, were not built
for modern apps. To properly manage communication between microservices in
cloud-native environments, enterprises will increasingly adopt service mesh
technology.
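As a rough illustration of what a service mesh handles on every call, the Python sketch below wraps a microservice-to-microservice request with a retry policy and a workload-identity check. Real meshes such as Istio or Linkerd do this transparently in a sidecar proxy rather than in application code; the service, policy, and identity names here are hypothetical.

```python
# Conceptual sketch of sidecar-style behavior on a service-to-service call:
# retries for transient failures plus a workload-identity check (as mTLS would provide).
# Illustrative only; a real mesh enforces this in a proxy, not in the app.
import random
import time

class CallPolicy:
    """Per-route policy a mesh control plane might push to its sidecars."""
    def __init__(self, retries=5, backoff_s=0.05, require_mtls=True):
        self.retries = retries
        self.backoff_s = backoff_s
        self.require_mtls = require_mtls

def flaky_inventory_service(request, peer_identity=None):
    """Stand-in for a downstream microservice that fails intermittently."""
    if peer_identity != "spiffe://demo/orders":     # reject unauthenticated callers
        raise PermissionError("unauthenticated peer")
    if random.random() < 0.3:                       # simulate a transient failure
        raise ConnectionError("upstream connect error")
    return {"sku": request["sku"], "in_stock": True}

def mesh_call(service_fn, request, policy: CallPolicy, caller_identity: str):
    """Sidecar-style wrapper: attach identity and retry transient failures."""
    last_err = None
    for _attempt in range(policy.retries):
        try:
            identity = caller_identity if policy.require_mtls else None
            return service_fn(request, peer_identity=identity)
        except ConnectionError as err:              # transient: back off and retry
            last_err = err
            time.sleep(policy.backoff_s)
    raise last_err

if __name__ == "__main__":
    reply = mesh_call(flaky_inventory_service, {"sku": "A-100"},
                      CallPolicy(), caller_identity="spiffe://demo/orders")
    print(reply)
```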
3. NetOps and SecOps help DevOps shoulder the burden for cloud-native
apps
Successfully executing a process as
complicated as cloud-native app adoption requires the involvement of many
different teams. Many enterprises think they only really need developer and
DevOps teams to drive cloud-native app adoption. As a result, they end up with
unsecured, poorly performing cloud-native apps, if they even get that far. In
2021, DevOps teams will deploy more collaborative infrastructure platforms that enable them to bring in NetOps and SecOps to "share the load, but without delays," smoothing the transition to a successful cloud-native environment.
These groups will collaborate far more effectively and openly than they have in
the past.
4. Microservices finally start moving into the mainstream as a second wave of adoption unfolds
The initial wave of microservices adoption was
driven by the trend towards APIs and SaaS-based products in the mid-2010s;
however, microservices did not see widespread adoption during this time and
mostly caught on only among advanced developer teams. Despite progress during that initial wave, apps built with microservices have historically been difficult to debug and maintain. But in 2021, that will start to
change for several reasons.
One key driver is the growing emergence of Kubernetes as the de facto option for managing containers. Microservices are part of Kubernetes' DNA: they are the primary way apps are developed and deployed on Kubernetes. The growing distribution of apps and infrastructure across hybrid cloud and edge environments will further boost microservices adoption. The highly distributed nature of hybrid cloud and edge settings makes them a natural fit for microservices, as the very purpose of microservices is to provide an alternative to large, often unwieldy, monolithic app architectures. The rise of multi-cloud will also lead to more microservices adoption. Due to these factors, over the next 12 months we will see microservices begin to move into mainstream usage.
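As a hedged illustration of that deployment pattern, the sketch below declares one hypothetical microservice as a Kubernetes Deployment using the official Python client (pip install kubernetes); the image name, labels, and replica count are made up, and applying it would require access to a real cluster.

```python
# Minimal sketch: one stateless microservice declared as a Kubernetes Deployment.
# Names, image, and replica count are hypothetical; this only builds the object
# locally unless the apply lines are uncommented against a real cluster.
from kubernetes import client, config   # config is only needed to apply the object

def build_deployment() -> client.V1Deployment:
    """Declare a small microservice as a Deployment with three replicas."""
    container = client.V1Container(
        name="orders",
        image="registry.example.com/shop/orders:1.0",   # hypothetical image
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    pod_template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": "orders"}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=3,                                      # scale by changing one field
        selector=client.V1LabelSelector(match_labels={"app": "orders"}),
        template=pod_template,
    )
    return client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="orders"),
        spec=spec,
    )

if __name__ == "__main__":
    deployment = build_deployment()
    # Uncomment to apply against the cluster in your current kubeconfig context:
    # config.load_kube_config()
    # client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
    print(deployment.metadata.name, "->", deployment.spec.replicas, "replicas")
```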
5. Kubernetes usage continues to lag for critical business apps
Even though Kubernetes has become a red-hot
trend among tech media and influencers, few organizations are actually
deploying Kubernetes for critical business apps. Even forward-looking
enterprises that have widely deployed containers, such as those in the tech and
financial services sectors, tend to only use Kubernetes for a small fraction of
their containerized workloads. Operational difficulties are the biggest issue.
Simply put, Kubernetes is very challenging to deploy, manage and run at
scale.
To make things more complicated, deploying
Kubernetes in production and at scale requires massive internal buy-in. Unlike
microservices, which are leveraged almost exclusively by developer teams and
don't necessarily require top-down approval from IT to adopt, Kubernetes
impacts an organization's entire infrastructure stack and often must be
greenlit by the CIO before it can be fully deployed. Kubernetes also has a
steep learning curve. It is a radical departure from VMs, which the majority of
IT pros still rely on to deploy and run infrastructure and applications.
Because of these factors, Kubernetes will not yet see widespread usage
supporting business apps next year. Eventually, though, organizations will
realize they can turn to pre-made Kubernetes clouds to overcome these
challenges, and full production deployments at scale will finally take
off.
6. Multi-cloud over multiple clouds: One lifts organizations up, the
other drags them down
Many enterprises claim they are multi-cloud
today, but in reality, they are just using multiple clouds individually and
paying multiple cloud providers. These organizations typically only run each of
their cloud applications at a single cloud provider (even if that application
may be in multiple locations with that provider). A real multi-cloud
architecture is best suited for modular, cloud-native apps: for example,
running microservice A of an app in Azure and microservice B in AWS. This "true
multi-cloud" approach, which will continue to gain steam over the next 12
months, will help organizations better embrace the strengths of each cloud
provider, allowing them to leverage critical specialties of each cloud. True
multi-cloud will also yield better availability for cloud applications (if one cloud or service goes down, the app keeps running in another).
Specifically, in 2021, true multi-cloud strategies will allow organizations to
better leverage microservices and make them more effective.
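A minimal sketch of the difference: below, each microservice of one hypothetical app is pinned to a primary cloud with a failover path to the other, so the app keeps serving even when one provider has an outage. The placement, service names, and in-memory stand-ins for real endpoints are all assumptions for illustration.

```python
# Illustrative sketch of "true multi-cloud": individual microservices of one app
# pinned to different providers, with failover across clouds. Hypothetical only.

# Per-microservice placement: primary cloud first, failover cloud second.
PLACEMENT = {
    "checkout":  ["azure", "aws"],   # microservice A runs primarily in Azure
    "inventory": ["aws", "azure"],   # microservice B runs primarily in AWS
}

# Stand-ins for service instances in each cloud; a real deployment would
# resolve these to load-balanced endpoints.
CLOUDS = {
    "azure": {"checkout": lambda req: {"cloud": "azure", "ok": True},
              "inventory": lambda req: {"cloud": "azure", "ok": True}},
    "aws":   {"checkout": lambda req: {"cloud": "aws", "ok": True},
              "inventory": lambda req: {"cloud": "aws", "ok": True}},
}

UNAVAILABLE = set()   # clouds currently experiencing an outage

def call(service: str, request: dict) -> dict:
    """Route a request to the service's primary cloud, failing over if it is down."""
    for cloud in PLACEMENT[service]:
        if cloud in UNAVAILABLE:
            continue                  # skip clouds marked as down
        return CLOUDS[cloud][service](request)
    raise RuntimeError(f"{service}: no cloud available")

if __name__ == "__main__":
    print(call("checkout", {"cart": 1}))   # served from Azure
    UNAVAILABLE.add("azure")               # simulate an Azure outage
    print(call("checkout", {"cart": 1}))   # same app keeps running from AWS
```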
##
About the Author
Ankur is
the founder and CEO of Volterra. Previously, he was the founder and CEO of
Contrail Systems, which pioneered telco NFV and SDN technologies and was
acquired by Juniper Networks in 2012. Contrail is the most widely deployed
networking platform in Tier 1 telco mobile networks (AT&T, DT, Orange, NTT
and Reliance JIO), and is used in many SaaS providers' cloud deployments
(Workday, Volkswagen, DirecTV). Prior to Contrail, Ankur was the CTO and VP
Engineering at Aruba Networks, a global leader in wireless solutions. He holds
an MS in Electrical Engineering from Stanford University and a BS in Electrical
Engineering from the University of Southern California.