Tom Schreck

Reputation: 5287

AWS CloudFormation update Lambda Code to use latest version in S3 bucket

I'm trying to create a CloudFormation template supporting a Lambda function and an AWS CodeBuild project that builds .NET Core source code into a zip file deployed to an S3 bucket.

This is all working just fine. What I'm struggling with is how to get the Lambda function to use the updated compiled source code in the S3 bucket.

Here is a subset of the CloudFormation template:

Resources:
  Lambda:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: roicalculator-eventpublisher
      Handler: RoiCalculator.Serverless.EventPublisher::RoiCalculator.Serverless.EventPublisher.Function::FunctionHandler
      Code:
        S3Bucket: deployment-artifacts
        S3Key: RoiCalculatorEventPublisher.zip
      Runtime: dotnetcore2.1

  CodeBuildProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: RoiCalculator-EventPublisher-Master
      Artifacts:
        Location: deployment-artifacts
        Name: RoiCalculatorEventPublisher.zip
        Type: S3
      Source:
        Type: GITHUB
        Location: https://github.com/XXXXXXX
        BuildSpec: RoiCalculator.Serverless.EventPublisher/buildspec.yml

Here is a subset of buildspec.yml:

phases:
  install:
    runtime-versions:
      dotnet: 2.2
    commands:
      - dotnet tool install -g Amazon.Lambda.Tools
  build:
    commands:
      - dotnet restore
      - cd RoiCalculator.Serverless.EventPublisher
      - dotnet lambda package --configuration release --framework netcoreapp2.1 -o ./bin/release/netcoreapp2.1/RoiCalculatorEventPublisher.zip
      - aws s3 cp ./bin/release/netcoreapp2.1/RoiCalculatorEventPublisher.zip s3://deployment-artifacts/RoiCalculatorEventPublisher.zip

You can see that the same artifact name (RoiCalculatorEventPublisher.zip) and S3 bucket (deployment-artifacts) are used in both the buildspec (for generating and copying the zip) and the CloudFormation template (for the Lambda function's source).

Since I'm overwriting the application code in the S3 bucket using the same file name the Lambda function points to, why isn't the Lambda function being updated with the latest code?

How do version numbers work? Is it possible to have a 'system variable' containing the name of the artifact (file name + version number) and access the same 'system variable' in both the buildspec AND the CloudFormation template?

What's the secret sauce for using a CloudFormation template to build the source code (via buildspec) with CodeBuild as well as update the Lambda function that consumes the generated code?

Thank you.

Upvotes: 6

Views: 11771

Answers (3)

Gowsik

Reputation: 1126

There are two other options:

  1. Add an AWS CLI step to the pipeline that calls

    update-function-code

    after the new zip is uploaded (a sketch follows this list).

  2. New Deployment Options for AWS Lambda
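
For option 1, a minimal sketch of the CLI call (the function name and bucket are taken from the question; where exactly this step runs in your pipeline is an assumption):

    aws lambda update-function-code \
        --function-name roicalculator-eventpublisher \
        --s3-bucket deployment-artifacts \
        --s3-key RoiCalculatorEventPublisher.zip

This forces Lambda to re-read the zip from S3 even though the key is unchanged, which a CloudFormation update alone will not do.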

Upvotes: 0

shariqmaws

Reputation: 8890

Unfortunately, unless you change the "S3Key" on the 'AWS::Lambda::Function' resource with every update, CloudFormation will not see it as a change (it does not look inside the zipped code for changes).

Options:

Option 1) Update the S3 key with every upload (a sketch follows)
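
A minimal sketch of option 1 (the CodeS3Key parameter name is an assumption): pass the key in as a stack parameter and supply a fresh value, e.g. one embedding the commit hash, on every deployment:

    Parameters:
      CodeS3Key:
        Type: String

    Resources:
      Lambda:
        Type: AWS::Lambda::Function
        Properties:
          FunctionName: roicalculator-eventpublisher
          Handler: RoiCalculator.Serverless.EventPublisher::RoiCalculator.Serverless.EventPublisher.Function::FunctionHandler
          Runtime: dotnetcore2.1
          Code:
            S3Bucket: deployment-artifacts
            S3Key: !Ref CodeS3Key

In the buildspec, CODEBUILD_RESOLVED_SOURCE_VERSION (the commit hash CodeBuild exposes) is a convenient source for such a unique key.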

Option 2) The recommended approach is to use AWS SAM to author the Lambda template, then use the "cloudformation package" command to package the template, which takes care of creating a unique S3 key and uploading the file to the bucket. Details here: https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-deploying.html

Edit 1:

In response to your comment, let me add some details on the SAM approach:

The basic idea is to use CloudFormation as the deployment tool for your Lambda function in your pipeline. Deploying a Lambda function this way works as follows:

1) Create a SAM template for your Lambda function

2) A basic SAM template looks like:

    AWSTemplateFormatVersion: '2010-09-09'
    Transform: 'AWS::Serverless-2016-10-31'
    Resources:
      FunctionName:
        Type: 'AWS::Serverless::Function'
        Properties:
          Handler: index.handler
          Runtime: nodejs6.10
          CodeUri: ./code

3) Add a directory "code" and keep the Lambda code files in this directory

4) Install the SAM CLI [1]

5) Run the command to package and upload:

$ sam package --template-file template.yaml --output-template-file packaged.yaml --s3-bucket {your_S3_bucket}

6) Deploy the package:

$ aws cloudformation deploy --template-file packaged.yaml --stack-name stk1 --capabilities CAPABILITY_IAM

You can keep the template code (Steps 1-2) in CodeCommit/GitHub and do Steps 4-5 in a CodeBuild step (a buildspec sketch follows). For Step 6, I recommend doing it via a CloudFormation action in CodePipeline that is fed the "packaged.yaml" file as an input artifact.
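
For the CodeBuild step (Steps 4-5), a sketch of a buildspec (assumptions: the AWS CLI's "cloudformation package" command is used instead of the SAM CLI, since the AWS CLI is preinstalled in CodeBuild images, and the bucket name is a placeholder):

    version: 0.2
    phases:
      build:
        commands:
          # Uploads ./code to the bucket under a content-hash key and rewrites
          # CodeUri in packaged.yaml to point at the uploaded object
          - aws cloudformation package --template-file template.yaml --output-template-file packaged.yaml --s3-bucket your-artifact-bucket
    artifacts:
      files:
        - packaged.yaml

Because the S3 key is derived from a hash of the code, a changed zip always produces a changed key, which is exactly what makes CloudFormation pick up the update.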

See also [2].

References:

[1] Installing the AWS SAM CLI on Linux - https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install-linux.html

[2] Building a Continuous Delivery Pipeline for a Lambda Application with AWS CodePipeline - https://docs.aws.amazon.com/en_us/lambda/latest/dg/build-pipeline.html

Upvotes: 8

amittn

Reputation: 2355

  • I am using aws s3 sync instead of aws s3 cp and have never had this problem.
  • I am working on a project with a serverless architecture and multiple Lambdas, where we have multiple folders, each containing just a Python file and a requirements.txt.
  • Usually the directory and the Lambda are named the same for convenience; e.g. the folder email_sender would have a Python file email_sender.py and a requirements.txt if it needs one.
  • In CodeBuild, after installing the dependencies, this is how we zip each function:
      echo "--- Compiling lambda zip: ${d}.zip"
      d=$(tr "_" "-" <<< "${d}")
      zip -q -r ${d}.zip . --exclude ".gitignore" --exclude "requirements.txt" --exclude "*__pycache__/*" > /dev/null 2>&1
      mv ${d}.zip ../../${CODEBUILD_SOURCE_VERSION}/${d}.zip

  • And while copying to the S3 bucket we use aws s3 sync as follows:

      aws s3 sync ${CODEBUILD_SOURCE_VERSION}/ ${S3_URI} --exclude "*" --include "*.zip" --sse aws:kms --sse-kms-key-id ${KMS_KEY_ALIAS} --content-type "binary/octet-stream" --exact-timestamps

Upvotes: 0
