Reputation: 168
I have two stacks. What is the official way of including a custom Lambda layer in the pipeline stack so that it relays code-location information back to my application stack?
I have followed the documentation to make regular lambdas work... found here: https://docs.aws.amazon.com/cdk/latest/guide/codepipeline_example.html?shortFooter=true
Some documentation and/or example code would be greatly appreciated.
Can anyone point me in the right direction?
Thx
Upvotes: 2
Views: 2005
Reputation: 388
It's a shame no one has answered this question until now, but I was trying to solve the same problem and came up with this, so I hope it helps someone like us :)
There are zillions of ways to skin this cat, but I am trying to go for clean CICD with easy developer work. The route I chose was to build my Lambda layer with a Code.from_docker_build() object. I supplied a Dockerfile I wrote, which can package my code into whatever code artifact I need, and then CDK knows how to handle it. That becomes my Lambda layer, which I can then consume in other stacks/lambdas.
So here's what you need to do:
1. Create a Dockerfile in your repo which can build your code into an artifact.
2. The Dockerfile should finish by putting your single code artifact file into the /asset directory. You should end up with just one file (a tarball, a zip, or whatever) that is your code in "artifact" form, ready to run in Lambda.
3. Use Code.from_docker_build() as the code object in your function.
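As a concrete illustration of those steps, here is a minimal Dockerfile sketch — the base image, the requirements.txt, and the archive name are my assumptions, not part of the original setup:

```dockerfile
# Illustrative only: any base image that can build your code works
FROM python:3.8-slim
WORKDIR /build
COPY requirements.txt .
# Lambda expects a Python layer's contents under a top-level python/ dir
RUN pip install -r requirements.txt -t python/
# Finish with exactly one artifact file in /asset for CDK to pick up
RUN mkdir /asset && python -m zipfile -c /asset/layer.zip python/
```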
import os

# CDK v1-style imports, matching the cdk.Construct signature below
from aws_cdk import core as cdk
from aws_cdk.aws_lambda import Code, LayerVersion, Runtime


class YourLambdaLayer(cdk.Stack):
    def __init__(self, scope: cdk.Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Create the Lambda layer; Docker builds the code artifact
        new_version = LayerVersion(self, construct_id,
            layer_version_name=construct_id,
            code=Code.from_docker_build(
                path=os.path.abspath("./"),
                file='Dockerfile'
            ),
            compatible_runtimes=[Runtime.PYTHON_3_8],
        )

        # Export the layer version ARN so other stacks can import it
        cdk.CfnOutput(self, f"{construct_id}-Arn-Export",
            value=new_version.layer_version_arn,
            export_name='your-cool-layer-Arn'
        )
Everything else from the example in the link you posted should work as you expect. The only difference is that you're using a Docker image to bundle your artifacts instead of supplying a zip or something like that.
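If it helps to see that bundling contract in plain Python: the Dockerfile's job boils down to producing a single archive with your code under a python/ prefix, the layout Lambda unpacks onto the path for Python layers. This is only a local sketch — package_layer, the paths, and the throwaway module are mine, not CDK API:

```python
import pathlib
import tempfile
import zipfile

def package_layer(src_dir: pathlib.Path, out_dir: pathlib.Path) -> pathlib.Path:
    """Zip src_dir under a top-level python/ prefix and write the single
    artifact into out_dir (a Dockerfile would target /asset instead)."""
    out_dir.mkdir(parents=True, exist_ok=True)
    artifact = out_dir / "layer.zip"
    with zipfile.ZipFile(artifact, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(src_dir.rglob("*")):
            if f.is_file():
                # Re-root each file under python/ inside the archive
                zf.write(f, str(pathlib.Path("python") / f.relative_to(src_dir)))
    return artifact

# Demo with a throwaway module standing in for real dependencies
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "src").mkdir()
(tmp / "src" / "mylib.py").write_text('VERSION = "0.1.0"\n')
artifact = package_layer(tmp / "src", tmp / "asset")
print(zipfile.ZipFile(artifact).namelist())  # ['python/mylib.py']
```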
This is what a pipeline resource would look like for a Python function, built using a Dockerfile in the pipeline...
from aws_cdk import core as cdk
from aws_cdk.aws_codecommit import Repository
from aws_cdk.pipelines import (
    CodePipeline, CodePipelineSource, ManualApprovalStep, ShellStep
)


class YourCICDStack(cdk.Stack):
    def __init__(self, scope, id, env=None,... **kwargs):
        super().__init__(scope, id,..., env=env, **kwargs)

        code_repo = Repository(self, 'a-git-repo-in-code-commit',
            repository_name='a-cool-python-package-from-u',
            description='you can be long winded whilst describing, if you like :)'
        )

        pipeline = CodePipeline(self, 'resource-name-goes-here',
            pipeline_name='pipeline-name-goes-here',
            docker_enabled_for_synth=True,  # !!! important !!!
            synth=ShellStep("Synth",
                input=CodePipelineSource.code_commit(
                    repository=code_repo,
                    branch='development',
                ),
                commands=[
                    "pip install -r requirements.txt",
                    "npm install -g aws-cdk",
                    f"cdk synth ..."
                ]
            )
        )

        # Add the stages for deploying
        your_stage = pipeline.add_stage(YourLayerStage(self, ..., env=env))
        your_stage.add_post(ManualApprovalStep('approval'))
So now that you've got your pipeline publishing your Lambda layer, you can use it from other stacks with either from_layer_version_arn() or from_layer_version_attributes(). Both are class methods on LayerVersion, so in your other stacks you call them like this:
my_cool_layer_ref = LayerVersion.from_layer_version_arn(
    self, 'your-cool-layer-ref',
    cdk.Fn.import_value('your-cool-layer-Arn')
)
# Include it in your other stacks/functions
some_other_func = Function(...,
layers=[my_cool_layer_ref],
...
)
Upvotes: 2