How to deploy an AWS Lambda REST API using Chalice and AWS Code Pipeline

This article shows how to deploy an AWS Lambda function as a REST API using the Chalice library and AWS Code Pipeline.

Table of Contents

  1. AWS Lambda with Chalice
  2. Deployment script
  3. Expected directory structure
  4. AWS Code Pipeline deployment step

In the first step, we’ll define the lambda function. Then we’ll write the deployment script and a definition of a Code Pipeline step to run the script.

AWS Lambda with Chalice

We have to create a new Chalice application and wrap our lambda function in the route decorator. We’ll put the Python code in the app.py file.

from chalice import Chalice

app = Chalice(app_name="predictor")

# Expose the lambda function as a POST endpoint at the root path.
# The parsed request body is available as app.current_request.json_body.
@app.route("/", methods=["POST"])
def index():
    return ""

In addition to the Python code, we must specify the configuration in the .chalice/config.json file:

{
  "version": "2.0",
  "app_name": "lambda-function-name",
  "autogen_policy": false,
  "automatic_layer": true,
  "environment_variables": {
      "NAME_OF_ENV_VARIABLE": "ITS VALUE"
  },
  "stages": {
    "dev": {
      "api_gateway_stage": "api"
    }
  }
}

If the lambda function requires any IAM permissions, we put the IAM policy definition in the .chalice/policy-dev.json file. Note that the dev part of the name refers to the API Gateway stage, so if we use a different one, we have to rename the file and change the stage name in .chalice/config.json.
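For example, a minimal .chalice/policy-dev.json that lets the function read objects from a single S3 bucket could look like this (the bucket name is a placeholder; the CloudWatch Logs statement is needed because autogen_policy is disabled, so Chalice will not add it for us):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}
```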

Finally, we have to create a requirements.txt file with the lambda function libraries. Be careful: AWS Lambda limits the zipped deployment package (your code plus all dependencies) to 50 MB, and the unzipped size to 250 MB.
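As an illustration, a model-serving function might pin its runtime dependencies like this (the package names and versions below are placeholders; list whatever your function actually imports):

```
scikit-learn==0.23.1
pandas==1.0.5
```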

Deployment script

Before we start, we create the requirements-dev.txt file containing the libraries used for deploying the code:

-r requirements.txt

boto3==1.14.12
chalice==1.20.0

After defining the development dependencies, we create the deploy_lambda.sh file in which we use Chalice to deploy our code:

#!/bin/bash
set -euo pipefail

# Deploy (or update) the lambda function and its API Gateway stage.
chalice deploy --stage dev

Expected directory structure

When we finish, the directory structure should look like this:

lambda
  - .chalice
    - config.json
    - policy-dev.json
  - app.py
  - deploy_lambda.sh
  - requirements.txt
  - requirements-dev.txt

AWS Code Pipeline deployment step

To deploy the code, we create an AWS Code Pipeline project:

EndpointDeploymentProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: !Sub ${AWS::StackName}-pipeline-endpointdeployment
      Description: Deploys a Lambda
      ServiceRole: !GetAtt CodeDeploymentRole.Arn
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        Type: LINUX_CONTAINER
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/python:3.6.5
      Source:
        Type: CODEPIPELINE
        BuildSpec: !Sub |
          version: 0.2
          phases:
            pre_build:
              commands:
                - echo "Installing requirements"
                - pip install --upgrade pip
                - pip install -r lambda/requirements-dev.txt
            build:
              commands:
                - echo "Running Chalice"
                - cd lambda
                - bash deploy_lambda.sh
            post_build:
              commands:
                - echo "Deployed"
          artifacts:
            files:
              - '**/*'
      TimeoutInMinutes: 30

Besides the build step definition, we must define the code deployment role and its permissions. It is better to limit the permissions as much as possible, but if you don’t care about configuring fine-grained permissions, you can use this configuration:

CodeDeploymentRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: !Sub ${AWS::StackName}-codedeploy-role
      AssumeRolePolicyDocument:
        Statement:
          - Action: ["sts:AssumeRole"]
            Effect: Allow
            Principal:
              Service: [codebuild.amazonaws.com]
        Version: "2012-10-17"
      Path: /
      Policies:
        - PolicyName: UploadAccess
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Action:
                  - codepipeline:*
                  - s3:*
                  - logs:*
                  - iam:GetRole
                  - iam:CreateRole
                  - iam:PutRolePolicy
                  - iam:PassRole
                  - lambda:*
                  - apigateway:*
                Effect: Allow
                Resource: "*"

In the end, we add a step to the AWS Code Pipeline:

- Name: Deploy_Endpoint
  Actions:
    - Name: EndpointDeployment
      ActionTypeId:
        Category: Build
        Owner: AWS
        Provider: CodeBuild
        Version: "1"
      Configuration:
        ProjectName: !Ref "EndpointDeploymentProject"
      InputArtifacts:
        - Name: dpl
      OutputArtifacts:
        - Name: enddpl
      RunOrder: "4"

Have you noticed that I deploy the code using a CodeBuild step? If you know how to get the same result using CodeDeploy, let me know!
