parakeet

Reputation: 465

Migrating existing BigQuery datasets to Terraform

Background:

I'm new to Terraform but an active BigQuery user. We have an existing BigQuery project that has > 100 datasets and we want to start using Terraform to control access (dataset level only, not table or column level) as manually assigning access using the UI is no longer scalable. Our plan is to implement Terraform on 5 datasets and over time add datasets so as to eventually include every dataset in the project.

Problem:

I'm struggling to figure out how to import all these datasets automatically before running terraform apply. My main.tf file contains 5 blocks like this:

resource "google_bigquery_dataset" "dataset_1" {
  dataset_id    = "dataset_1"
  location      = local.location
  project       = local.project_id

  access {
    ...
  }
}

Is there a way to run terraform import to automatically import all datasets listed in main.tf?

Upvotes: 2

Views: 3266

Answers (1)

Andre Araujo

Reputation: 2400

Usually I run terraform plan to check the resources, then I write/fix the code in the *.tf files and import the state.

To import the state for a BigQuery dataset, use one of these options:

$ terraform import google_bigquery_dataset.default projects/{{project}}/datasets/{{dataset_id}}
$ terraform import google_bigquery_dataset.default {{project}}/{{dataset_id}}
$ terraform import google_bigquery_dataset.default {{dataset_id}}

After that you will receive a message like this:

Import successful!

The resources that were imported are shown above. These resources are now in
your Terraform state and will henceforth be managed by Terraform.
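Since terraform import handles only one resource per invocation, importing all the datasets declared in main.tf has to be scripted. A minimal sketch, assuming the resource names in main.tf match the dataset IDs (the project ID and dataset names below are placeholders for your own values):

```shell
#!/bin/sh
# Sketch: generate one terraform import command per dataset.
# Replace "my-project" and the dataset list with your own values;
# remove the leading "echo" to actually run the imports.
project="my-project"

for ds in dataset_1 dataset_2 dataset_3 dataset_4 dataset_5; do
  echo terraform import "google_bigquery_dataset.${ds}" "projects/${project}/datasets/${ds}"
done
```

If the list is long, you could instead extract the dataset_id values from main.tf (e.g. with grep) and feed them to the same loop, so the script stays in sync with the configuration.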

Upvotes: 2
