Reputation: 1198
I have the following AWS CDK pipeline, which works. It takes source from two different GitHub repositories (one for the application code, one for the CDK code) and builds both:
import * as cdk from 'aws-cdk-lib';
import * as codepipeline_actions from 'aws-cdk-lib/aws-codepipeline-actions';
import * as codepipeline from 'aws-cdk-lib/aws-codepipeline';
import * as codebuild from 'aws-cdk-lib/aws-codebuild';
import { Construct } from 'constructs';
import { CodePipeline, CodePipelineSource, ShellStep, CodeBuildStep } from 'aws-cdk-lib/pipelines';

export class CdkPipelineStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // parameters
    const ConnectionArn = 'arn:aws:codestar-connections:region:account:connection/xxx-yyyy';

    // Source
    const cdkCodeStarConnection = CodePipelineSource.connection('owner/cdk-repo', 'master', {
      connectionArn: ConnectionArn,
    });
    const lambdaCodeStarConnection = CodePipelineSource.connection('owner/lambda-repo', 'master', {
      connectionArn: ConnectionArn,
    });

    // Building Lambda
    const lambdaBuildStep = new CodeBuildStep('BuildLambda', {
      buildEnvironment: {
        buildImage: codebuild.LinuxBuildImage.AMAZON_LINUX_2_3,
      },
      input: lambdaCodeStarConnection,
      commands: ['variouscommands'],
      partialBuildSpec: codebuild.BuildSpec.fromObject({
        phases: {
          install: {
            'runtime-versions': {
              python: '3.9',
            },
            commands: ['python --version'],
          },
        },
      }),
    });

    const pipeline = new CodePipeline(this, 'Pipeline', {
      pipelineName: 'Pipeline',
      synth: new ShellStep('Synth', {
        input: cdkCodeStarConnection,
        commands: ['npm ci', 'npm run build', 'npx cdk synth', 'npx cdk ls'],
        additionalInputs: { 'lambda': lambdaBuildStep.addOutputDirectory('lambda') },
      }),
    });
  }
}
Now, what I would like to achieve is to assign the built code (the output of lambdaBuildStep) to a Lambda function I have in another stack. In the code snippet below, I want assign_output_CodeBuildStep to be the S3 location where lambdaBuildStep has built my function:
import * as cdk from 'aws-cdk-lib';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import { Construct } from 'constructs';
import { Bucket } from 'aws-cdk-lib/aws-s3';
import { lambdaBuildStep } from 'cdk-pipeline-stack';
import { ArtifactMap } from 'aws-cdk-lib/pipelines';

export class LambdaMonolith extends cdk.Stack {
  constructor(scope: Construct, id: string) {
    super(scope, id);

    const s3Code = assign_output_CodeBuildStep;

    const LambdaFunction = new lambda.Function(this, 'MyFunction', {
      code: s3Code,
      runtime: lambda.Runtime.PYTHON_3_8,
      handler: 'lambda.handler',
    });
  }
}
However, I can't seem to find a way to pass that information from the pipeline stack to the Lambda stack.
Any help would be greatly appreciated.
Thanks!
Upvotes: 3
Views: 1585
Reputation: 1198
So I found a solution, though maybe not the solution. It does seem quite convoluted, and I am sure there is a better way.
The solution lies in the fact that the ShellStep in the CodePipeline construct checks out the output of additionalInputs (i.e. the result of the previous CodeBuildStep, lambdaBuildStep) into a directory that is generated dynamically but exposed through an environment variable called CODEBUILD_SRC_DIR_BuildLambda_lambda_repo. As you can see, the name is a combination of the name of the CodeBuildStep and the repository, with dashes changed to underscores.
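For reference, a quick way to confirm the exact variable name is to dump the CODEBUILD_SRC_DIR_* variables from the Synth step itself. Below is a minimal sketch based on the Synth step from the pipeline above; the extra env command is purely for debugging and can be removed once the name is known:

// Same Synth step as in the pipeline stack, with one extra debugging command.
const synth = new ShellStep('Synth', {
  input: cdkCodeStarConnection,
  commands: [
    'env | grep CODEBUILD_SRC_DIR || true', // lists the source directories CodeBuild injected for additional inputs
    'npm ci',
    'npm run build',
    'npx cdk synth',
    'npx cdk ls',
  ],
  additionalInputs: { 'lambda': lambdaBuildStep.addOutputDirectory('lambda') },
});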
So my solution was to use this environment variable as my Lambda code asset:
const LambdaFunction = new lambda.Function(this, 'MyFunction', {
  code: lambda.Code.fromAsset(process.env.CODEBUILD_SRC_DIR_BuildLambda_lambda_repo || ""),
  runtime: lambda.Runtime.PYTHON_3_8,
  handler: 'lambda.handler',
});
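Note that when you run cdk synth outside the pipeline (e.g. locally), the environment variable is not set, so the || "" fallback makes Code.fromAsset fail. A possible workaround, sketched below under the assumption that the real code only matters when the pipeline deploys the stack, is to fall back to a placeholder inline handler when the variable is absent:

const lambdaCodePath = process.env.CODEBUILD_SRC_DIR_BuildLambda_lambda_repo;
const LambdaFunction = new lambda.Function(this, 'MyFunction', {
  code: lambdaCodePath
    ? lambda.Code.fromAsset(lambdaCodePath)
    : lambda.Code.fromInline('def handler(event, context):\n    return None'),
  runtime: lambda.Runtime.PYTHON_3_8,
  // the inline placeholder is written to index.py, so the handler must match it
  handler: lambdaCodePath ? 'lambda.handler' : 'index.handler',
});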
Upvotes: 1