Serverless Applications and Docker

How to Scale the Latest Trend in Infrastructure

Photo by Christina @ wocintechchat.com on Unsplash

Serverless applications are all the rage in cloud architecture circles at the moment. It's easy to see why: This method of cloud computing provides both DevOps teams and application developers with loads of flexibility, on-demand scaling, and quick time-to-market.

But what are serverless applications, and how do they integrate with containers? Here's what you need to know.

# What are Serverless Applications?

Serverless applications, or Function as a Service (FaaS) applications, provide backend services on an as-used basis, usually for a single operation or job. Serverless applications still use servers, of course, but you pay based on how much the application is actually used rather than for a fixed amount of bandwidth or a set number of servers.

Serverless applications run at a lower cost because you are not charged for unused capacity or idle time. They also offer simplified scalability, because the vendor handles scaling on demand, spinning up new instances as requests from end users increase.
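To make the pay-per-use model concrete, here is a minimal sketch of a single-purpose function written as an AWS Lambda-style Python handler. The event fields are hypothetical, but the `handler(event, context)` entry point is the standard Lambda shape, and you are billed only for the time the function actually runs.

```python
# A minimal, single-purpose function in the AWS Lambda style.
# The provider invokes handler() per request and bills only for the
# time it actually runs; the event fields below are hypothetical.
import json

def handler(event, context):
    # Pull the payload out of a hypothetical API Gateway-style event.
    body = json.loads(event.get("body", "{}"))
    name = body.get("name", "world")

    # Do one small unit of work, then exit; no long-lived server process.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```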

Most serverless providers — AWS, Google Cloud, and Microsoft Azure being among the more popular — also provide a variety of services such as databases and storage that integrate seamlessly with their serverless products.

Similarly, serverless applications often have simpler backend code and quicker turnarounds for changes, which in many cases contributes to a faster time to market.

# How Do I Implement Containers in a Serverless Environment?

Although Docker originally did not have a serverless integration, things changed in 2017 when Azure Container Instances and AWS Fargate came to market.

Because Docker containers often package services that are meant to run in the background, containerized apps are an excellent fit for serverless platforms. Dockerized workloads can be deployed to serverless services, creating a clean, predictable 1:1 relationship between application and infrastructure. Containerizing functions lets services match demand precisely and thus run at a lower cost.

Containers are best deployed in serverless environments when you want them to run as ephemeral jobs that start, do something, then stop. When you need long-lived computations or constant availability, spinning up Docker containers in a serverless platform may not be performant or cost-effective.
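As a sketch of that "start, do something, then stop" pattern, the Dockerfile below packages a hypothetical one-shot job script; the contents are illustrative, but the shape is typical of a container meant to run as an ephemeral serverless task.

```dockerfile
# Hypothetical one-shot job image: the container starts, runs job.py to
# completion, and exits -- the ephemeral pattern serverless container
# platforms bill for by the second.
FROM python:3.11-slim

WORKDIR /app

# Install only what the job needs to keep cold starts and costs down.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY job.py .

# No long-running server: the entrypoint is the job itself.
ENTRYPOINT ["python", "job.py"]
```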

Most critically for your cloud bill, optimizing your containers is key, because billing starts the moment your Docker container spins up.

Many serverless users consider the main pain point to be the challenge of creating a framework in which all of their functions can run together.

With Docker, however, developers can bundle all of their scripts into images and then devote their energy to engineering the architecture in which those pieces run together. Services like AWS Fargate can be intimidating to set up, but once configured they offer a streamlined infrastructure that many developers find efficient and enjoyable.

# What's the Best Cloud Platform for Serverless Applications?

There are a variety of places where Docker and serverless platforms overlap to create a high-powered software deployment. Here's how the major providers stack up.

# Amazon Web Services Fargate

Amazon describes AWS Fargate as “a serverless compute engine for containers that works both with Amazon Elastic Container Service and Amazon Elastic Kubernetes Service.” As discussed above, this marriage of Amazon compute services and Docker lets you focus on building your applications by removing the need to manage servers. Fargate is the most popular serverless container service at the moment, and the recent release of Amazon's App Runner product helps coordinate build and deploy cycles.
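As a hedged sketch of what that looks like in practice: once an image is pushed and a task definition has been registered for it, a single CLI call launches it on Fargate. The cluster, task definition, and subnet names below are placeholders.

```bash
# Launch a registered task definition on Fargate (all names are placeholders).
# You pay only for the vCPU and memory the task consumes while it runs.
aws ecs run-task \
  --cluster my-cluster \
  --launch-type FARGATE \
  --task-definition my-one-shot-job:1 \
  --count 1 \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-0123456789abcdef0],assignPublicIp=ENABLED}"
```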

# Google Cloud Serverless

Google Cloud's serverless computing page advertises the following benefits: speed to market, a simple developer experience, and automatic scaling. We've touched on some of these already, but one keeps emerging as a theme: a simple developer experience. You want your developers expanding on your company's products, not spending all of their time and talent managing servers.
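To illustrate that developer experience, deploying a prebuilt container image to Cloud Run, one of Google Cloud's serverless container services, is roughly a one-liner; the service name, project, region, and image tag below are placeholders.

```bash
# Deploy an existing container image to Cloud Run (names are placeholders).
# Cloud Run scales instances with traffic and scales to zero when idle.
gcloud run deploy my-service \
  --image gcr.io/my-project/my-service:latest \
  --region us-central1 \
  --allow-unauthenticated
```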

# Microsoft Azure Serverless

With the largest portfolio in the industry, Microsoft Azure prides itself on offering easy-to-run serverless applications that, like the competition's, can be paired with Docker. While Azure offers the same benefits as the services listed above, it also has a reputation for strong, up-to-date security, which is comforting for DevSecOps teams given the often frenetic pace of shipping that comes with serverless approaches.
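On the Azure side, one hedged example: Azure Container Instances can run a Docker image as a one-off, per-second-billed container from a single command. The resource group, container name, and image below are placeholders.

```bash
# Run a Docker image as a one-off container on Azure Container Instances
# (names are placeholders); --restart-policy Never keeps it an ephemeral job.
az container create \
  --resource-group my-resource-group \
  --name my-one-shot-job \
  --image myregistry.azurecr.io/worker:latest \
  --cpu 1 \
  --memory 1.5 \
  --restart-policy Never
```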

# Other Serverless Computing Platforms

There are a variety of other serverless computing platforms that don't come from "the Big Three" and can be paired with Docker for incredible results. If you want something more boutique for your stack, these smaller providers are worth a look as well.

# What Are the Challenges in Deploying a Containerized Serverless Environment?

Like any new way of working, a container-based serverless environment comes with pain points that users should be aware of.

First, Docker or not, you should expect that engineers working with serverless may need additional time to become productive. Because the technology is still relatively new, it is important to budget research and testing time so your developers can do their best work on serverless projects, and to invest in training and onboarding for developers switching to this approach.

Additionally, where Docker users could previously get by without much cloud knowledge, those looking to combine containers with serverless platforms will likely need to build their understanding of the cloud to do their best work. A basic grasp of cloud infrastructure, user permissions, and networking will set developers up for success.

Finally, and perhaps most critically: because billing starts as soon as you spin up your Docker container, image size is critical. If you have not optimized your images by following container best practices, you will be charged for space you are not using, and you may be opening yourself up to security risks.
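One common way to keep image size, and with it cost and attack surface, down is a multi-stage build: install dependencies in a throwaway stage and copy only what the runtime needs into a slim final image. The sketch below assumes a hypothetical Python service (app.py and requirements.txt), but the pattern applies broadly.

```dockerfile
# Multi-stage build to keep the final image small (hypothetical app.py service).
# Stage 1: install dependencies into an isolated prefix.
FROM python:3.11-slim AS builder
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir --prefix=/install -r requirements.txt

# Stage 2: copy only the installed packages and the app into a slim runtime
# image, leaving pip caches and build tooling behind.
FROM python:3.11-slim
WORKDIR /app
COPY --from=builder /install /usr/local
COPY app.py .
ENTRYPOINT ["python", "app.py"]
```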

# Stay on the Cutting Edge of Containers with Slim.AI

We know it and you know it: Serverless applications are trendy right now, and containers will need to keep pace to remain a natural fit for these emerging services. Whether you've found a new way to **reduce your Docker image sizes** to keep your costs down, or have tried out a new serverless platform with your containers, we'd love to hear how things are going! Request access to our closed beta and build with us today.
