Reputation: 169
I am trying to use Terraform to spin up a lambda function that uses source code in a github release package. The location of the package is:
https://github.com/DataDog/datadog-serverless-functions/releases
This will allow me to manually create the AWS DataDog forwarder without using their Cloudformation template (we want to control as much of the process as possible).
I'm not entirely sure how to pull down that zip file for lambda functions to use
resource "aws_lambda_function" "test_lambda" {
filename = "lambda_function_payload.zip"
function_name = "datadog-forwarder"
role = aws_iam_role.datadog_forwarder_role.arn
source_code_hash = filebase64sha256("lambda_function_payload.zip")
runtime = "python3.7"
environment {
variables = {
DD_API_KEY_SECRET_ARN = aws_secretsmanager_secret_version.dd_api_key.arn
#This stops the Forwarder from generating enhanced metrics itself, but it will still forward custom metrics from other lambdas.
DD_ENHANCED_METRICS = false
DD_S3_BUCKET_NAME = aws_s3_bucket.datadog_forwarder.name
}
}
}
I know that the source_code_hash will change whenever the file name changes, and that the filename of the lambda function will change as well. Any help would be appreciated.
Upvotes: 3
Views: 1647
Reputation: 2470
Here is an example configuration to achieve this using the terraform_data resource and the local-exec provisioner:
locals {
  remote_file_url = "https://github.com/path/to/file.zip"

  # make sure you download the file to a directory
  # where the Terraform process has write permissions
  local_file_path = "/tmp/file.zip"
}

resource "terraform_data" "download_lambda_zip_file" {
  input = {
    remote_file_url = local.remote_file_url
    local_file_path = local.local_file_path
  }

  # This is to make sure the resource is recreated
  # each time. Adapt to your use case.
  triggers_replace = [
    timestamp()
  ]

  provisioner "local-exec" {
    # using "curl -Lo" to follow redirects as it didn't work otherwise
    command = "curl -Lo ${self.input.local_file_path} ${self.input.remote_file_url}"
  }

  provisioner "local-exec" {
    when    = destroy
    command = "rm -f ${self.input.local_file_path}"
  }
}
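To wire the downloaded archive into the Lambda function itself, a minimal sketch could look like the following (the handler and runtime values are assumptions to adapt to your function; depends_on makes sure the download provisioner has run before the function is created):

resource "aws_lambda_function" "datadog_forwarder" {
  # Make sure the zip has been downloaded by the provisioner above
  depends_on = [terraform_data.download_lambda_zip_file]

  function_name = "datadog-forwarder"
  role          = aws_iam_role.datadog_forwarder_role.arn
  filename      = local.local_file_path
  handler       = "lambda_function.lambda_handler" # hypothetical handler
  runtime       = "python3.11"                     # hypothetical runtime

  # source_code_hash is omitted here on purpose: filebase64sha256() would
  # fail at plan time if the file has not been downloaded yet.
}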
Upvotes: 1
Reputation: 11
There is a way to specify a zip file for an AWS Lambda. Check out the example configuration in https://github.com/hashicorp/terraform-provider-aws/blob/main/examples/lambda.
It uses a data source of type archive_file
data "archive_file" "zip" {
type = "zip"
source_file = "hello_lambda.py"
output_path = "hello_lambda.zip"
}
to set the filename and source_code_hash for the aws_lambda_function resource:
resource "aws_lambda_function" "lambda" {
function_name = "hello_lambda"
filename = data.archive_file.zip.output_path
source_code_hash = data.archive_file.zip.output_base64sha256
.....
}
See the example files for complete details.
The Terraform AWS provider calls the CreateFunction API (https://docs.aws.amazon.com/lambda/latest/dg/API_CreateFunction.html), which allows you to specify a zip file.
Upvotes: 1
Reputation: 238587
There is no built-in functionality to download files from the internet in Terraform. But you could do it relatively easily by using an external data source. For that you would create a bash script that uses curl to download your zip, open it up, and inspect it or do any processing you need. The data source would also return data that you can use when creating your function.
An alternative is to use null_resource with local-exec to curl your zip file. But local-exec is less versatile than using the external data source.
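As a rough sketch of the external data source approach (assuming the hashicorp/external provider; the release URL, download path, and the inline script are placeholders to adapt to the actual DataDog forwarder asset you want):

locals {
  # Placeholder: point this at the actual release asset you need
  forwarder_zip_url  = "https://github.com/DataDog/datadog-serverless-functions/releases/download/<tag>/<asset>.zip"
  forwarder_zip_path = "/tmp/dd-forwarder.zip"
}

data "external" "forwarder_zip" {
  # The external provider expects the program to print a JSON object to
  # stdout; this inline script downloads the zip (following redirects)
  # and returns its local path to Terraform.
  program = [
    "bash", "-c",
    "curl -sSLo '${local.forwarder_zip_path}' '${local.forwarder_zip_url}' && printf '{\"path\": \"%s\"}' '${local.forwarder_zip_path}'"
  ]
}

resource "aws_lambda_function" "datadog_forwarder" {
  function_name    = "datadog-forwarder"
  role             = aws_iam_role.datadog_forwarder_role.arn
  filename         = data.external.forwarder_zip.result.path
  # The data source downloads the file when it is read, so the path should
  # exist by the time this hash is computed.
  source_code_hash = filebase64sha256(data.external.forwarder_zip.result.path)
  # ... remaining arguments (handler, runtime, environment) as in the question
}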
Upvotes: 3