Virtualization Technology News and Information
3 Ways DevOps Practices at the Edge Can Give Retailers a Boost this Holiday Season

By Sid Phadkar 

For many, the holiday season translates to cheer, festivities, and feasting. Any developer in the retail industry will tell you it also translates to code freezes and a hard pause on initiatives focused on operational excellence. It is often assumed that CI/CD pipelines, cloud migration strategies, and pretty much anything to do with DevOps is the last thing on the minds of development teams during the holiday season. However, in a world where 93% of organizations consider themselves to have adopted DevOps, this should no longer be the case.

Here are three ways DevOps practices at the edge can give retailers a boost this holiday season:

1. Maintain Agility Amidst Last-Minute Changes

Whether it is the last few weeks leading up to the holidays or the days right in the middle of them, emergency fixes and enhancements are commonplace during this time of year. These changes will most likely involve content delivery, since they directly affect a business's performance, routing, and caching settings.

It is critical that these changes are treated just like any other infrastructure change and integrated into the continuous delivery pipeline. Whether it is spinning up a last-minute campaign or refreshing an application when inventory runs out, the underlying edge delivery settings need to fit into automated workflows consistent with other vendor settings, be tested early, and be monitored in real time.

DevOps teams need to be empowered to orchestrate changes within their managed Kubernetes cluster by unit testing and making pull requests within their local GitOps environment, automating deployments, and accelerating overall delivery without risking stability. The maturity of the continuous delivery pipeline for these integral content delivery settings defines how quickly a business can react to surprises and hiccups.
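As a sketch of what testing edge delivery settings early can look like, the following Python check validates a made-up edge configuration before it is merged. The config format, the `validate_edge_config` helper, and the rule that `/api` routes must never be cached are all illustrative assumptions for this sketch, not any vendor's actual schema.

```python
import json

# Hypothetical edge delivery config, as it might live in a GitOps repo.
EDGE_CONFIG = json.loads("""
{
  "routes": [
    {"path": "/api/*", "origin": "api-cluster", "cache_ttl_seconds": 0},
    {"path": "/static/*", "origin": "assets-bucket", "cache_ttl_seconds": 86400}
  ]
}
""")

def validate_edge_config(config):
    """Return a list of problems; an empty list means the change is safe to merge."""
    problems = []
    for route in config.get("routes", []):
        path = route.get("path", "")
        if not route.get("origin"):
            problems.append(f"route {path} has no origin")
        ttl = route.get("cache_ttl_seconds", -1)
        if ttl < 0:
            problems.append(f"route {path} has invalid TTL {ttl}")
        # Illustrative policy: dynamic API traffic must not be cached at the edge.
        if path.startswith("/api") and ttl != 0:
            problems.append(f"route {path} caches dynamic API traffic")
    return problems

if __name__ == "__main__":
    assert validate_edge_config(EDGE_CONFIG) == []
    print("edge config OK")
```

A check like this can run on every pull request, so a last-minute caching change is caught by the pipeline rather than by customers.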

2. Provide Consistent Experiences at Scale

During the holiday season, a website or application infrastructure undergoes its most rigorous stress tests. Any glitch during this period can be catastrophic to a brand and its annual sales targets.

But the risk of infrastructure being overwhelmed by traffic spikes is higher today than ever, because autonomous, distributed (microservice-based) application teams control their own destinies independent of one another in a quest to increase release velocity. Often referred to as "concurrent DevOps," this model is increasingly common and empowers teams to make decisions on their own. Unfortunately, in an API-first world, it also means well-intentioned development teams sacrifice long-term scale and stability in favor of time-to-market obligations, and these design choices can come back to hurt organizations at the most inopportune times.

A best practice in concurrent DevOps strategies is to place a layer, such as an API gateway, in front of all of the applications to ensure consistency in how distributed teams expose them. The challenge is that most gateways simply serve as routers and do not address the fundamental issues associated with scale, providing little to no value during traffic spikes or when origins get overwhelmed.

Leveraging an API gateway at the edge as a first layer of defense helps keep applications healthy during the most consequential times. It can authenticate, authorize, and, if necessary, throttle requests at the edge, ensuring infrastructure does not go down during the most important time of the year. Distributed teams can worry less about infrastructure and focus their attention on the underlying functionality logic.
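Throttling at the edge is commonly implemented with a token-bucket policy. The sketch below is a minimal, illustrative Python version (not any particular gateway's implementation) showing how a per-client bucket admits an initial burst and then caps the sustained request rate before traffic ever reaches the origin.

```python
import time

class TokenBucket:
    """Minimal token-bucket throttle, the kind of per-client policy an
    edge gateway can enforce so origins never see the full traffic spike."""

    def __init__(self, rate_per_second, burst):
        self.rate = rate_per_second        # sustained requests allowed per second
        self.capacity = burst              # maximum burst size
        self.tokens = float(burst)         # bucket starts full
        self.last = time.monotonic()

    def allow(self):
        # Refill tokens based on elapsed time, capped at the burst capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # caller would return HTTP 429 at the edge

bucket = TokenBucket(rate_per_second=5, burst=10)
results = [bucket.allow() for _ in range(15)]
# roughly the first 10 requests pass on the initial burst; the rest are throttled
```

Because the refill math is cheap and local, a policy like this can run at every edge location without coordination, which is what makes it viable during a traffic spike.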

3. Maintain a Firm Security Posture

Knowing full well the monetary opportunity that the season brings - not to mention the increased likelihood that application teams will forego comprehensive security testing in favor of putting out a new feature - malicious actors are also incredibly active this time of year.

For retailers, a vulnerability found at this stage could mean complete loss of customer trust and brand value. It is essential to enforce a common security posture across all applications to avoid this risk - however, doing so can often be time-consuming. Moreover, while the holiday period requires retailers to be more nimble than at any other time of the year, traditional security providers struggle to provide the latest security definitions and capture real-time insights into the external traffic their properties and applications handle.

Leveraging DevSecOps practices at the edge - maintaining the most up-to-date security definitions via insights gathered at the edge, automating protection for any new applications, testing security policies early in the development lifecycle, and making security a natural extension of the CI/CD pipeline - helps ensure agility without compromising on security.
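One way to make security policies a natural extension of the CI/CD lifecycle is to express them as tests that run on every commit. The sketch below uses hypothetical, deliberately crude pattern rules (real edge security rules are far more sophisticated) purely to show the shape of such a check.

```python
import re

# Hypothetical edge security rules, the kind a DevSecOps pipeline would
# exercise before every deploy rather than after an incident.
BLOCKED_PATTERNS = [
    re.compile(r"(?i)\bunion\s+select\b"),   # crude SQL-injection signature
    re.compile(r"(?i)<script\b"),            # crude cross-site-scripting signature
    re.compile(r"\.\./"),                    # crude path-traversal signature
]

def is_blocked(request_path_and_query):
    """Return True if the request matches any blocked pattern."""
    return any(p.search(request_path_and_query) for p in BLOCKED_PATTERNS)

# Assertions that run in CI, making the security policy part of the pipeline.
assert is_blocked("/search?q=1 UNION SELECT password FROM users")
assert is_blocked("/item?name=<script>alert(1)</script>")
assert not is_blocked("/products?category=holiday-deals")
```

Keeping the rules and their tests in the same repository means a rule change that suddenly blocks legitimate holiday traffic fails the pipeline instead of failing in production.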

The holiday season requires retailers to be more ready for change than any other time of the year - without compromising on agility, security, or scale. Extending an organization's DevOps practices to the edge can help achieve the optimal combination of these critical capabilities and sets teams up for worry-free days, so they can enjoy all of the festivities, cheer, and feasting the season has to offer.


About the Author

Sid Phadkar 

Sid is a product manager at Akamai focused on enabling delightful experiences for our developer base. He is currently focused on making Akamai an organic component of our users' continuous integration workflows, as well as making Akamai the go-to platform for any API traffic needs for our customers.

As a Product Manager, Sid loves to understand the why behind things. He is passionate about making decisions informed by customer stories and data.

Prior to joining Akamai, Sid spent a few years consulting for tech companies on optimizing development lifecycles and a few years as a PM at Dell EMC, launching the company's first-ever subscription-based product offering aimed at hybrid cloud datacenters. Sid holds a Computer Science degree from UT Dallas and an MBA from Duke University (he has particularly strong opinions on Duke basketball). In his spare time, Sid can be found trying to make an impression in local pickup soccer leagues around Boston.
Published Thursday, October 10, 2019 7:28 AM by David Marshall