Illusionist

Reputation: 5489

Using terraform remote state in s3 with multiple folders

I am currently using the default workspace and my folder structure is like this:

dev
├── app
│   └── main.tf
├── mysql
│   └── main.tf
└── vpc
    └── main.tf

I have an S3 backend created and it works fine for a single folder:

terraform {
  backend "s3" {
    bucket         = "mybucket"
    key            = "global/s3/mykey/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "terraform-state-wellness-nonprod"
    encrypt        = true
  }
}

I am struggling with how to include this backend config in all the folders. I want to use the same backend S3 bucket in app, mysql and vpc (with different keys for the DynamoDB lock entries), but while this works in one folder, in the second folder Terraform wants to delete both the S3 bucket and the DynamoDB table.
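For illustration, this is the kind of per-folder backend I mean, e.g. in the app folder (the key path here is just an example, only the key differs per folder):

terraform {
  backend "s3" {
    bucket         = "mybucket"
    key            = "dev/app/terraform.tfstate"   # example key, different in each folder
    region         = "us-east-1"
    dynamodb_table = "terraform-state-wellness-nonprod"
    encrypt        = true
  }
}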

Upvotes: 6

Views: 3780

Answers (1)

GNOKOHEAT

Reputation: 963

I recommend using a module structure in your Terraform code, like this:

dev
├── modules
│   ├── app
│   │   └── app.tf
│   ├── mysql
│   │   └── mysql.tf
│   └── vpc
│       └── vpc.tf
└── main.tf

main.tf:

module "app" {
  source = "./modules/app"
...
}

module "mysql" {
  source = "./modules/mysql"
...
}

module "vpc" {
  source = "./modules/vpc"
...
}

terraform {
  backend "s3" {
    ...
  }
}
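Each module then holds its own resources. As a rough sketch (the variable and resource below are placeholders for illustration, not taken from the question), modules/app/app.tf could look like:

# modules/app/app.tf - placeholder resources for illustration
variable "ami_id" {
  type = string
}

variable "instance_type" {
  type    = string
  default = "t3.micro"
}

resource "aws_instance" "app" {
  ami           = var.ami_id
  instance_type = var.instance_type
}

The root main.tf passes values into each module block (e.g. ami_id above), while the backend "s3" block stays only in the root, so all three modules share one state file in the same bucket.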

If you want to apply or destroy each module individually:

terraform apply -target module.app
terraform destroy -target module.app
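A typical flow might be (run from the root folder, which holds the backend config):

terraform init                        # configure the S3 backend and install modules/providers
terraform plan  -target module.app    # preview changes scoped to the app module
terraform apply -target module.app    # apply only the app module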

See:

Here's a repository using a module structure.

Upvotes: 9
