Building Serverless with Docker

If you have experience with Azure Functions or AWS Lambdas, then the title may sound a bit like an oxymoron. One of the key benefits of “Function as a Service” (FaaS) or “serverless” offerings is that developers do not have to worry about infrastructure concerns such as virtual machines, containers, and the like. You create a function in your favorite language and ship it, while effectively all administration is handled by your cloud provider. So, where does Docker come into play? In this article, we will discuss how you can use Docker and the serverless framework to supplement your local development experience.

Serverless

Several frameworks are available to create, build, and deploy serverless functions. I tend to gravitate towards the serverless framework. Because it is not tied to a single cloud, the serverless framework has the following benefits.

  • Support for all major cloud providers such as Azure, AWS, and GCP
  • Strong community backing with plugins galore
  • Free and open source

It is also super intuitive and easy to use! While the serverless framework supports Azure, AWS, and GCP, we will be focusing on AWS going forward. Getting started is a breeze. First, you will need to install the serverless CLI, which can be accomplished with the following command.

npm install -g serverless

Next, type serverless, and a wizard will guide you through creating your new project.

The serverless framework is a topic in itself, so we will not go into details in this article. That being said, there is a wealth of information on their website. If you are new to the serverless framework, please take a look at their getting started guide and AWS provider documentation.
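
To make this concrete, below is a minimal sketch of the kind of Node.js handler the serverless wizard scaffolds. The function name and response body here are illustrative, not what the wizard will generate verbatim.

```javascript
// handler.js — a minimal async Lambda handler that returns an
// API Gateway-style response object.
const hello = async (event) => ({
  statusCode: 200,
  body: JSON.stringify({ message: 'Hello from serverless!' }),
});

module.exports = { hello };
```

The `handler: handler.hello` entry in serverless.yml would then point at this export.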

Serverless Offline

One of my favorite aspects of the serverless framework is the extensibility through its plugin architecture. Coupling this with its strong community support provides developers with plugins for the majority of their development needs. The most popular of these is the serverless-offline plugin, which allows developers to run serverless functions behind a local HTTP server. By simulating AWS Lambda functions and API Gateway, developers can quickly run and debug their serverless applications entirely on their own machine!

As with most plugins, installation is very straightforward. First, we need to install the plugin with npm.

npm install serverless-offline --save-dev

Next, the plugin needs to be declared in the serverless.yml configuration file.

plugins:
  - serverless-offline

An additional section can be added to the serverless.yml file to configure the plugin. It is common to override default behavior here, such as the hosting port.

custom:
  serverless-offline:
    httpPort: 9999
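
With that in place, any function wired to an HTTP event will be served on the configured port. The fragment below is an illustrative function definition (the `hello` name and path are assumptions, not from the original project); with the `httpPort` above, serverless-offline would expose it at http://localhost:9999/hello.

```yaml
functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: hello
          method: get
```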

Visual Studio Code Remote Containers

Now that we have our serverless solution running locally, let’s start incorporating Docker into the mix. I use Visual Studio Code for the majority of my development work. Aside from being an excellent IDE, it comes with a handy feature called Remote Containers. Using a remote container allows us to do our development completely within a Docker container. Sorry guys, no more “it works on my machine” excuses.

Visual Studio Code walks us through creating the required files for running in a container. All we have to do is run the ‘Remote-Containers: Add Development Container Configuration Files’ command.

Once complete, we have all the required files for running a development container. Executing the ‘Remote-Containers: Reopen in Container’ command will reopen Visual Studio Code within our new container. If we open the terminal, we will notice it is a bash shell! As before, we can run our application with the following command.

npx serverless offline

Pretty cool!

DynamoDB

As exciting as building serverless functions is, these functions seldom live in a vacuum. For instance, it is typical for a Lambda function in AWS to integrate with DynamoDB. In this case, how can I run my functions locally? Sure, I can point my Lambda to a Dynamo table in AWS, but this is not always desirable, especially for large teams. Enter DynamoDB Local. DynamoDB Local is a downloadable version of DynamoDB designed for local development. Lucky for us, Amazon also provides an easy-to-use Docker image, which we can run along with our serverless application. To configure serverless to use our local container, we must install another plugin, serverless-dynamodb-local. As with our other plugin, we need to install it with npm…

npm install serverless-dynamodb-local --save-dev

…and add it to our configuration file.

plugins:
  - serverless-dynamodb-local
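
The plugin can then be configured under the `custom` section of serverless.yml. The fragment below is one plausible configuration, shown for illustration; check the plugin's documentation for the full set of options.

```yaml
custom:
  dynamodb:
    stages:
      - dev
    start:
      port: 8000
      inMemory: true
      migrate: true   # create tables from serverless.yml resources on start
```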

Lastly, we will need to tell our Lambda to connect to our local DynamoDB instance. This will require some slight modifications to how we instantiate our DynamoDB client, as shown below.

new AWS.DynamoDB.DocumentClient({
    region: 'localhost',
    endpoint: 'http://localhost:8000'
})

This plugin also has some pretty cool features, such as schema migrations and seeding your tables. For more information, please take a look at their GitHub page.
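
In practice, you will likely want the same code to hit the local endpoint during development and the real AWS endpoint in production. One common approach, sketched below as a hypothetical helper, keys off the `IS_OFFLINE` environment variable that serverless-offline sets when it runs your functions locally.

```javascript
// Hypothetical helper: choose DynamoDB client options based on whether
// serverless-offline is running (it sets process.env.IS_OFFLINE).
function dynamoClientOptions(env) {
  if (env.IS_OFFLINE) {
    // Point the client at the local DynamoDB container.
    return { region: 'localhost', endpoint: 'http://localhost:8000' };
  }
  // In the cloud, fall back to the default AWS SDK configuration.
  return {};
}

module.exports = { dynamoClientOptions };
// usage: new AWS.DynamoDB.DocumentClient(dynamoClientOptions(process.env))
```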

Bringing it all together

Now that we have serverless AND DynamoDB running in a container, how can we bring the two together? This is where Visual Studio Code comes to the rescue again! We can configure our environment to run multiple containers via docker-compose by making a few small updates to the devcontainer.json file.

{
    "name": "Node.js",
    "dockerComposeFile": "docker-compose.yml",
    "service": "serverless.app",
    "workspaceFolder": "/workspace",
    // Set *default* container specific settings.json values on container create.
    "settings": { 
        "terminal.integrated.shell.linux": "/bin/bash"
    },
    // Add the IDs of extensions you want installed when the container is created.
    "extensions": [
        "dbaeumer.vscode-eslint"
    ],
    // Use 'forwardPorts' to make a list of ports inside the container available locally.
    // "forwardPorts": [],
    // Use 'postCreateCommand' to run commands after the container is created.
    // "postCreateCommand": "yarn install",
    // Comment out connect as root instead. More info: https://aka.ms/vscode-remote/containers/non-root.
    "remoteUser": "node"
}

Now we need to create a docker-compose.yml file. Of course, we can run any container our heart desires; however, three containers will do in our case. We will run one container for our serverless app, one for DynamoDB, and one for dynamodb-admin. Dynamodb-admin is a lightweight web application that is useful for managing your local DynamoDB instance.

Below is a copy of the docker-compose file I use.

version: '3'
services:
  serverless.app:
    build: 
      context: .
      dockerfile: Dockerfile
      args:
        VARIANT: 12
    volumes:
      - ..:/workspace:cached
    command: sleep infinity
  dynamodb.local:
    image: amazon/dynamodb-local
    ports:
      - "8000:8000"
    volumes:
      - ./db:/home/dynamodblocal/db
    command: ["-jar", "DynamoDBLocal.jar", "-sharedDb", "-dbPath", "/home/dynamodblocal/db"]
  dynamodb.admin:
    image: aaronshaf/dynamodb-admin
    ports:
      - 8001:8001
    environment: 
      - DYNAMO_ENDPOINT=http://dynamodb.local:8000

Now when we open our project in the development container, we can navigate to dynamodb-admin by browsing to http://localhost:8001. All of this gives us the ability to run our serverless functions and DynamoDB locally, with the consistency of a reproducible Docker environment!

This blog post was originally written by Jason Robert on espressocoder.com.
