Virtualization Technology News and Information
An Introduction To Serverless Architecture

By Mohamed Ahmed, founder and CEO of Magalix

What Is Serverless Architecture?

Since its inception, cloud computing has been designed to solve one major problem: scalability. The number of people accessing online services (web applications, mobile APIs, etc.) keeps growing exponentially. About a decade ago, engineers began adopting the microservices architecture: breaking up large, complex applications into small, atomic, loosely coupled components. The main benefit of microservices was the ability to scale each part of the application independently. Containers proved the most suitable technology for implementing them: each service runs in a container, and the container can have as many replicas as needed to scale up or down. With so many containers and nodes, a container orchestration system became necessary. Several contenders battled it out, and the ultimate winner was Kubernetes. Microservices, containers, and container orchestration, combined with the ease and elasticity of cloud computing, addressed the scalability problem for a while. However, it wasn't long before even this was not enough to accommodate the increasing demand for web applications and services. Hence, nanoservices came along: with nanoservices, engineers need not concern themselves with which image their containers should use to run their applications, and they no longer have to worry about what hardware will host their applications. That ultimately leads us to the "serverless" architecture.

The classic cloud infrastructure offerings can be summarized as follows:

Infrastructure as a Service (IaaS): you build and run your own virtual machines. The cloud provider is responsible only for the underlying infrastructure that lets you do that easily.

Platform as a Service (PaaS): goes one step further and automatically manages the virtual machines and other components (e.g., load balancers) for you. You're only responsible for writing and deploying your application. Examples include AWS Elastic Beanstalk, Google App Engine, and Heroku.

Software as a Service (SaaS): everything is set up for you by the cloud provider. You only consume the service, typically as part of a larger application. For example, you can use Dropbox as temporary (or permanent) storage from which your users download files.

Function as a Service (FaaS): very much like PaaS, except you don't pay for the servers, only for execution time.



If There Aren't Any Servers, How Do Applications Run?

Although serverless linguistically means "without servers," the serverless architecture does not mean that applications run without any infrastructure. Rather, it means that the responsibility for operating that infrastructure, from deploying the physical servers to the container that actually runs the code, is delegated to a third party. Developers only have to concern themselves with writing high-quality code; they deploy their programs by simply uploading the code, and the application runs. Behind the scenes, the serverless system launches a container with the appropriate image when the function receives a request. The container executes the request and then exits, freeing up its resources for other functions.
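In this model, a function is typically just a handler that takes a request event and returns a response; the platform supplies everything around it. The following Python sketch mimics an AWS Lambda-style handler (the `event`/`context` signature follows Lambda's convention; the call at the end merely simulates what the platform does on each request):

```python
import json

def handler(event, context):
    """A minimal function: receive a request event, return a response.
    The platform, not the developer, provisions the container that runs it."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally, we can simulate a single platform-driven invocation:
response = handler({"name": "serverless"}, None)
print(response["statusCode"])  # 200
```

Note that the handler holds no reference to a host, port, or server process; wiring a request to the function is entirely the platform's job.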

What Defines A Serverless Architecture?

As with many trending technical terms, you may find varying definitions depending on whom you ask. However, in 2016 the serverless architecture was discussed at AWS's re:Invent developer conference.

Eight rules were laid out that came to form the Serverless Compute Manifesto. For an architecture to truly be serverless, it must have the following features:

  • The function is the atomic unit of deployment: just as microservices are composed of services and Kubernetes deploys pods, serverless architectures are composed of functions. A function is the smallest deployable unit, and each function should operate independently of the others.
  • You only deploy a "function": you are not concerned with the physical or virtual machine, or the container, on which the function runs.
  • Functions are ephemeral: they do not store any permanent data. If necessary, they must use an external service like a database.
  • Fault tolerance: the infrastructure on which the function runs must be fault-tolerant in a way that ensures high availability and robustness.
  • Inherent scalability: with an increasing number of requests, the serverless system should scale up automatically to enable more functions to run. If the load decreases, the resources should scale down accordingly to save costs.
  • Pay as you run: while the traditional payment model for servers/containers is to charge clients as long as the service is running, serverless systems should incur costs only as long as the function is actually running. No costs should result from a function's idle time.
  • Use your preferred language: developers should not be bound to a specific language or runtime environment. Instead, the serverless system should be ready to run code in all modern programming languages.
  • Visibility: any output or logs produced by the function should be available for the developers to review. This ensures speed in addressing bugs, or issues, as they occur.
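The "functions are ephemeral" rule above has a subtle consequence worth illustrating: module-level state may survive between invocations while a "warm" container is being reused, but it vanishes whenever the platform recycles the container or scales out, so it must never be relied on for durable data. A small Python sketch (the counter is purely illustrative):

```python
# Module-level state lives only as long as this particular container does.
invocation_count = 0

def handler(event, context):
    global invocation_count
    invocation_count += 1  # resets to 0 whenever a fresh container starts
    # Durable data must go to an external service (database, object store).
    return {"seen_in_this_container": invocation_count}

# Two calls routed to the same warm container share the counter...
print(handler({}, None)["seen_in_this_container"])  # 1
print(handler({}, None)["seen_in_this_container"])  # 2
# ...but a scale-out or container recycle starts from zero again,
# which is why the manifesto forbids relying on local state.
```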

Why/When To Use Serverless Programming?

Generally speaking, any application that can run in containers (i.e., on a system like Kubernetes) can benefit from the serverless architecture. But the fact that you can do something doesn't necessarily mean that you should. Let's take a look at the potential benefits and use cases of serverless systems.

  • ETL and Big Data: big data, by definition, entails collecting, preparing, organizing, and storing huge amounts of data from various sources. Using serverless functions, you pay only for the data-processing time. Since you don't pay for idle time, you can direct your costs toward the work that matters.
  • On-demand CI/CD: a very common CI/CD scenario involves the version control server (e.g., GitHub) notifying the build server (e.g., Jenkins) of changes, such as new commits. The notification happens through webhooks. The problem is that the build server must be up and running 24/7 so that it's ready to receive the HTTP call whenever the repository server sends it. With the serverless architecture, the HTTP call can instead trigger a function that performs the necessary CI pipeline tasks (build, test, deploy, etc.), and you pay only for the build time. In fact, many services implement this model today, such as CircleCI and Bitbucket Pipelines.
  • Web applications: certain types of web applications can benefit from the serverless architecture. For example, AWS provides API Gateway, a layer for calling functions (implemented by another AWS product, Lambda). A typical serverless web application works by placing static files (HTML, CSS, JavaScript, and images) on a simple web server; most deployment scenarios use an S3 bucket for this purpose. Any backend calls are routed through AJAX to API Gateway, which in turn triggers the appropriate function to fulfill the request and send back the response. Ryan Kroonenburg, the founder of the online education platform A Cloud Guru, has previously mentioned that they use a serverless architecture to deliver their content, which helps them greatly optimize their infrastructure costs.
  • ChatOps: this is a relatively new topic. It describes a way of implementing changes and running commands through chatbots. For example, let's assume that your team needs to be able to issue a limited set of commands against a running Linux box or a Kubernetes cluster, but they don't have the know-how (e.g., they're not familiar with Bash or they dislike kubectl). A possible solution is a chatbot installed on the business messaging system (for example, Slack). The bot receives the command in plain English, translates it into the appropriate system command, and executes it against the target system. The serverless architecture is ideal here, since every execution can be delegated to a function, and as requests increase, the system scales up automatically to cater to them.
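For the web-application use case above, a backend function behind an HTTP gateway typically receives the request method and path in its event and returns a status code with a JSON body. This Python sketch follows the general shape of an API Gateway proxy-integration event (`httpMethod` and `path` are fields of that event format; the `/items` route and its data are made-up examples):

```python
import json

def api_handler(event, context):
    """Sketch of an HTTP-triggered backend function behind an API gateway."""
    method = event.get("httpMethod", "GET")
    path = event.get("path", "/")
    if method == "GET" and path == "/items":
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"items": ["a", "b", "c"]}),
        }
    # Anything unrecognized falls through to a 404.
    return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
```

The static front end only ever sees HTTP responses; whether one container or a thousand served them is invisible to the client.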
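The ChatOps flow described above can likewise be sketched as a single function that maps plain-English chat commands onto system commands. Everything here (the command table, the event shape) is a hypothetical illustration, and the sketch only builds the command rather than executing it:

```python
# Hypothetical mapping from plain-English chat commands to system commands.
COMMANDS = {
    "show pods": ["kubectl", "get", "pods"],
    "restart web": ["kubectl", "rollout", "restart", "deployment/web"],
}

def chatops_handler(event, context):
    """Each chat message triggers one function invocation."""
    text = event.get("text", "").strip().lower()
    cmd = COMMANDS.get(text)
    if cmd is None:
        return {"ok": False, "reply": f"Unknown command: {text!r}"}
    # In a real deployment, the function would execute `cmd` against the
    # target system (e.g., via subprocess or the Kubernetes API).
    return {"ok": True, "command": " ".join(cmd)}
```

Because the command table is an allow-list, the bot can only ever run the limited set of commands the team has approved.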

Implementing a serverless infrastructure can be done in either of two ways:

  1. Through your cloud provider: every major cloud provider has a product that implements the serverless paradigm. We already covered AWS Lambda and API Gateway as an example, but there are also Azure Functions and Google Cloud Functions, both of which offer the same type of environment.
  2. Using Kubernetes: since serverless functions use containers behind the scenes to do their magic, and since Kubernetes is, by far, the de facto container management system, you can use your existing Kubernetes cluster to create a serverless architecture. You can break parts of your applications into functions that get executed on Kubernetes in the form of Jobs. Since this is easier said than done, there are already tools that implement FaaS (Function as a Service) on Kubernetes, such as Kubeless and Knative. You may want to have a look at our "Implementing FaaS in Kubernetes using Kubeless" article, which contains a useful hands-on lab.
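To make the second option concrete: running a function on Kubernetes boils down to generating a small `batch/v1` Job manifest per invocation and letting the cluster run it to completion. Tools like Kubeless and Knative automate this, but the shape is easy to sketch; the image name and arguments below are placeholders:

```python
def job_manifest(fn_name: str, image: str, args: list) -> dict:
    """Build a Kubernetes Job that runs one function invocation to completion.
    (FaaS tools automate this; shown here purely for illustration.)"""
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": f"fn-{fn_name}"},
        "spec": {
            "backoffLimit": 0,              # the invocation succeeds or fails once
            "ttlSecondsAfterFinished": 60,  # free the resources shortly after the run
            "template": {
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [
                        {"name": fn_name, "image": image, "args": args}
                    ],
                }
            },
        },
    }

manifest = job_manifest("resize-image", "example.com/fn-runtime:latest",
                        ["--input", "photo.jpg"])
print(manifest["kind"])  # Job
```

The `ttlSecondsAfterFinished` field mirrors the serverless lifecycle described earlier: the container runs, exits, and its resources are reclaimed.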

To learn more about containerized infrastructure and cloud native technologies, consider coming to KubeCon + CloudNativeCon EU, March 30-April 2 in Amsterdam.


About the Author

Mohamed Ahmed 

Mohamed Ahmed is the founder and CEO of Magalix. He previously led engineering and data services at Climate Corporation, where he helped solve infrastructure cost and performance optimization problems with a team-based approach. This inspired him to start Magalix to help developers and DevOps engineers understand their infrastructure, adopt best practices, and optimize their Kubernetes clusters with machine learning tools. Previously, he worked on developer experience products at AWS and Microsoft Azure. He holds a PhD in Computer Science and Engineering from the University of Connecticut.

Published Tuesday, February 25, 2020 10:40 AM by David Marshall