Using LocalStack to deploy a serverless application on your local machine

In this article I will show you how to deploy an AWS serverless application to LocalStack.

Setup

First, you should have the following tools installed:

  1. Install Docker.
  2. Install the Serverless Framework.
  3. Install the AWS CLI. Although we aren’t going to work with real AWS, we’ll need it to talk to our local Docker containers.
  4. Once the AWS CLI is installed, run aws configure to create some credentials. You can use real or fake credentials (see the example after this list).
  5. Install awslocal: a thin wrapper around the AWS CLI for use with LocalStack.
  6. Install the serverless-localstack plugin: https://github.com/localstack/serverless-localstack.
  7. Create a docker-compose.yml file and a .localstack directory inside your project’s folder:

touch docker-compose.yml && mkdir .localstack
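As mentioned in step 4, LocalStack does not validate credentials, so dummy values are enough. For example, you could set up a fake profile like this (the values here are arbitrary placeholders):

aws configure set aws_access_key_id test
aws configure set aws_secret_access_key test
aws configure set region us-east-1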

Configuring the serverless-localstack plugin

There are two ways to configure the plugin: via a JSON file or via serverless.yml. You can check the official documentation for a customized configuration. I have done it via the serverless.yml file:

service: myService
plugins:
  - serverless-localstack
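In addition to registering the plugin, you can restrict it to specific stages so that only local deployments are redirected to LocalStack. A minimal custom section might look like the sketch below (the local stage name is just a convention; check the plugin’s README for the full list of options):

custom:
  localstack:
    stages:
      - local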

Configuring LocalStack with Docker

You can run LocalStack directly from the command line (by cloning the repo), but I like using Docker because you don’t need to worry about installing LocalStack and its dependencies on your system. Here’s the config for docker-compose.yml:

version: '2.1'
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4567-4597:4567-4597"
      - "${PORT_WEB_UI-8080}:${PORT_WEB_UI-8080}"
    environment:
      - SERVICES=${SERVICES- }
      - DEBUG=${DEBUG- }
      - DATA_DIR=${DATA_DIR- }
      - PORT_WEB_UI=${PORT_WEB_UI- }
      - LAMBDA_EXECUTOR=${LAMBDA_EXECUTOR- }
      - KINESIS_ERROR_PROBABILITY=${KINESIS_ERROR_PROBABILITY- }
      - DOCKER_HOST=unix:///var/run/docker.sock
    volumes:
      - "${TMPDIR:-/tmp/localstack}:/tmp/localstack"
      - "/var/run/docker.sock:/var/run/docker.sock"

Running Localstack

Now that we have our docker-compose.yml, we can spin up the container with: docker-compose up -d.
You can now access the different AWS services through different ports on localhost. For the purposes of this tutorial, we only care about S3, which is available at localhost:4572.
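To verify that everything started correctly, you can follow the container logs or send a quick request to the S3 endpoint; the exact output depends on your LocalStack version, so treat these as rough smoke tests:

docker-compose logs -f localstack
curl http://localhost:4572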

Creating AWS resources

The next step is to create the AWS resources your application needs. As an example, we’ll create an S3 bucket with the following command:

aws --endpoint-url=http://localhost:4572 s3 mb s3://tutorial

When using the aws command, the --endpoint-url argument is required to tell the CLI to create the bucket in your LocalStack instance instead of real AWS. However, since we installed awslocal earlier, we can do the same without specifying the endpoint URL each time:

awslocal s3 mb s3://tutorial

After the command runs, a new bucket named "tutorial" is created and ready to use. You can check it by going to the URL: http://localhost:4572/tutorial
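As a quick sanity check, you can upload a file to the new bucket and list its contents (the file name here is just an example):

echo "hello from LocalStack" > hello.txt
awslocal s3 cp hello.txt s3://tutorial/hello.txt
awslocal s3 ls s3://tutorial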

Deploying our serverless application

Once we have all our AWS resources created, we can deploy our serverless application locally with the following command:

sls deploy --stage local
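If the deployment succeeds, the stack is created inside LocalStack rather than in AWS. Assuming your service defines at least one function, you can confirm this by asking LocalStack what it knows about:

awslocal cloudformation describe-stacks
awslocal lambda list-functions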

Invoking a Lambda function locally

Finally, once we have our stack created, we will be able to invoke any Lambda function locally with:

sls invoke -f functionName --stage local
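If your function expects an event payload, you can pass it inline with the --data option (the function name and payload here are placeholders):

sls invoke -f functionName --stage local --data '{"name": "LocalStack"}'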

That's it! Thanks for reading!