Bogdan Pastiu

Reputation: 153

Terraform - Multiple aws_s3_bucket_notification triggers on the same bucket

I need to create a trigger on an S3 bucket. We use the following Terraform to create it:

resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = var.aws_s3_bucket_id

  lambda_function {
    lambda_function_arn = var.lambda_function_arn
    events              = ["s3:ObjectCreated:Put"]
    filter_prefix       = var.filter_prefix
    filter_suffix       = var.filter_suffix
  }
}

This works fine when the bucket does not already have a notification configured, which was the case for all environments except production. When we deployed to production, the trigger that was already present on the bucket was deleted. We need both triggers. I can add a second trigger manually (for example, another PUT event trigger with a different prefix), but when I create it from Terraform the previous one is always deleted. Is there anything I am missing?

Upvotes: 15

Views: 12603

Answers (2)

D.Fitz

Reputation: 531

@ydaetskcoR's answer helped solve my issue, but I wanted a more DRY solution.

This could be refined further, but it works for now. If it gets more complicated, I will switch to a module from the Terraform Registry.

locals {
  # List of trigger definitions
  triggers = [
    {
      type           = "s3"
      name           = "start-trigger"
      trigger_bucket = "my-bucket"
      filter_prefix  = null
      filter_suffix  = ".webm"
    },
    {
      type           = "s3"
      name           = "finish-trigger"
      trigger_bucket = "my-bucket"
      filter_prefix  = null
      filter_suffix  = ".json"
    }
  ]
}

# Get Trigger S3 Bucket(s)
data "aws_s3_bucket" "trigger_s3_bucket" {
  for_each = toset([for trigger in local.triggers : trigger.trigger_bucket])

  bucket = each.value
}

# Create Function
resource "aws_lambda_function" "this" {
...
}

# Create lambda_permission(s)
resource "aws_lambda_permission" "allow_bucket" {
  for_each = toset([for trigger in local.triggers : trigger.trigger_bucket if trigger.type == "s3"])

  statement_id  = "AllowExecutionFromS3Bucket"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.this.arn
  principal     = "s3.amazonaws.com"
  source_arn    = data.aws_s3_bucket.trigger_s3_bucket[each.value].arn
}

# Create at least 1 notification
resource "aws_s3_bucket_notification" "bucket_notification" {
  count  = length([for trigger in local.triggers : trigger if trigger.type == "s3"]) > 0 ? 1 : 0
  bucket = local.triggers[count.index].trigger_bucket

  dynamic "lambda_function" {
    for_each = { for trigger in local.triggers : trigger.name => trigger if trigger.type == "s3" }

    content {
      lambda_function_arn = aws_lambda_function.this.arn
      events              = ["s3:ObjectCreated:*"]
      filter_prefix       = lambda_function.value.filter_prefix
      filter_suffix       = lambda_function.value.filter_suffix
    }
  }

  depends_on = [aws_lambda_permission.allow_bucket]
}
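One limitation of the count-based resource above: it attaches every `s3` trigger to the bucket of the first trigger in the list, so it only works while all triggers share one bucket. If triggers ever span multiple buckets, a sketch (resource and local names here are hypothetical) using `for_each` with Terraform's grouping mode (`=> trigger...`, requires Terraform 0.12.6+) gives each bucket exactly one notification resource:

```hcl
# Group the s3 triggers by bucket name; the "..." grouping operator
# collects all triggers for the same bucket into a list.
locals {
  s3_triggers_by_bucket = {
    for trigger in local.triggers :
    trigger.trigger_bucket => trigger...
    if trigger.type == "s3"
  }
}

# One notification resource per bucket, holding all of that
# bucket's lambda_function blocks.
resource "aws_s3_bucket_notification" "per_bucket" {
  for_each = local.s3_triggers_by_bucket

  bucket = each.key

  dynamic "lambda_function" {
    for_each = each.value

    content {
      lambda_function_arn = aws_lambda_function.this.arn
      events              = ["s3:ObjectCreated:*"]
      filter_prefix       = lambda_function.value.filter_prefix
      filter_suffix       = lambda_function.value.filter_suffix
    }
  }

  depends_on = [aws_lambda_permission.allow_bucket]
}
```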

Upvotes: 0

ydaetskcoR

Reputation: 56997

The aws_s3_bucket_notification resource documentation mentions this at the top:

NOTE: S3 Buckets only support a single notification configuration. Declaring multiple aws_s3_bucket_notification resources to the same S3 Bucket will cause a perpetual difference in configuration. See the example "Trigger multiple Lambda functions" for an option.

Their example shows how this should be done by adding multiple lambda_function blocks in the aws_s3_bucket_notification resource:

resource "aws_iam_role" "iam_for_lambda" {
  name = "iam_for_lambda"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow"
    }
  ]
}
EOF
}

resource "aws_lambda_permission" "allow_bucket1" {
  statement_id  = "AllowExecutionFromS3Bucket1"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.func1.arn
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.bucket.arn
}

resource "aws_lambda_function" "func1" {
  filename      = "your-function1.zip"
  function_name = "example_lambda_name1"
  role          = aws_iam_role.iam_for_lambda.arn
  handler       = "exports.example"
  runtime       = "go1.x"
}

resource "aws_lambda_permission" "allow_bucket2" {
  statement_id  = "AllowExecutionFromS3Bucket2"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.func2.arn
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.bucket.arn
}

resource "aws_lambda_function" "func2" {
  filename      = "your-function2.zip"
  function_name = "example_lambda_name2"
  role          = aws_iam_role.iam_for_lambda.arn
  handler       = "exports.example"
  runtime       = "go1.x"
}

resource "aws_s3_bucket" "bucket" {
  bucket = "your_bucket_name"
}

resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = aws_s3_bucket.bucket.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.func1.arn
    events              = ["s3:ObjectCreated:*"]
    filter_prefix       = "AWSLogs/"
    filter_suffix       = ".log"
  }

  lambda_function {
    lambda_function_arn = aws_lambda_function.func2.arn
    events              = ["s3:ObjectCreated:*"]
    filter_prefix       = "OtherLogs/"
    filter_suffix       = ".log"
  }
}
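One refinement worth making to the example: S3 checks that it can invoke the Lambda functions when the notification configuration is applied, so without an explicit dependency Terraform may try to create the notification before the permissions exist and fail with a validation error. A small addition (using the same resource names as above) avoids that race:

```hcl
resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = aws_s3_bucket.bucket.id

  # ... lambda_function blocks as above ...

  # Ensure both invoke permissions exist before S3 validates the
  # notification configuration.
  depends_on = [
    aws_lambda_permission.allow_bucket1,
    aws_lambda_permission.allow_bucket2,
  ]
}
```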

Upvotes: 15
