TravelingLex

Reputation: 477

Iterate through map for file function

I have a use case for uploading multiple files to S3 using Terraform. I would like to upload multiple objects using the count meta-argument. To do this I need to iterate through the sources with file("${path.module}/path/to/file").

Is there any way to make the file function read from a mapped variable, leveraging count.index?

Upvotes: 7

Views: 17643

Answers (1)

Martin Atkins

Reputation: 74779

Terraform 0.12.8 introduced a new function, fileset, which returns the set of file paths matching a particular pattern in a particular base directory.
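To illustrate (with a hypothetical directory layout), the paths fileset returns are relative to the base directory given as its first argument:

```hcl
# Given a module directory containing:
#   files/a.txt
#   files/sub/b.txt
#
# fileset("${path.module}/files", "*")  # ["a.txt"] — top-level files only
# fileset("${path.module}/files", "**") # ["a.txt", "sub/b.txt"] — recursive
```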

We can combine that with resource for_each (rather than count) to upload the matching files to S3, like this:

resource "aws_s3_object" "example" {
  for_each = fileset("${path.module}/files", "*") # could use ** instead for a recursive search

  bucket = "example"
  key    = each.value
  source = "${path.module}/files/${each.value}"

  # Unless the bucket uses SSE-KMS or SSE-C encryption, the ETag of each
  # object is an MD5 hash of that object, so this causes Terraform to
  # re-upload a file whenever its content changes.
  etag = filemd5("${path.module}/files/${each.value}")
}

Using for_each instead of count here means that Terraform will identify each instance of the resource by its S3 path rather than by its position in a list, and so you can add and remove files without disturbing other files. For example, if you have a file called example.txt then Terraform will track its instance as aws_s3_object.example["example.txt"], rather than an address like aws_s3_object.example[3] where 3 is its position in the list of files.
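The difference can be sketched with a hypothetical file list, contrasting the position-based addresses count would produce with the path-based keys for_each produces:

```hcl
# With count over ["a.txt", "b.txt", "c.txt"], instances are position-based:
#   aws_s3_object.example[0]  (a.txt)
#   aws_s3_object.example[1]  (b.txt)
#   aws_s3_object.example[2]  (c.txt)
# Deleting a.txt shifts b.txt and c.txt to indexes 0 and 1, so Terraform
# plans to replace them even though those files are unchanged.
#
# With for_each, instances are keyed by the S3 path:
#   aws_s3_object.example["a.txt"]
#   aws_s3_object.example["b.txt"]
#   aws_s3_object.example["c.txt"]
# Deleting a.txt affects only its own instance.
```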


I have written a Terraform module that builds on fileset to also support template rendering and detecting file types based on filename suffixes, which might make life easier in some more complicated situations: apparentlymart/dir/template. You can use its result with aws_s3_object in a similar way to the above, as shown in its README.

Upvotes: 17
