Reputation: 765
I have a folder queries where a user will add, delete, and modify YAML files. Each YAML file represents a single Terraform resource on GCP, a Scheduled Query. What would be the cleanest way to loop over the queries folder and generate the appropriate number of Terraform resources in main.tf accordingly? I can use Python to generate the main.tf if that is easier.
Example for a single resource:
queries/alpha.yaml
display_name: "my-query"
data_source_id: "scheduled_query"
schedule: "first sunday of quarter 00:00"
destination_dataset_id: "results"
destination_table_name_template: "my_table"
write_disposition: "WRITE_APPEND"
query: "SELECT name FROM tabl WHERE x = 'y'"
This should create this resource in my main.tf:
resource "google_bigquery_data_transfer_config" "query_config" {
  display_name           = "my-query"
  data_source_id         = "scheduled_query"
  schedule               = "first sunday of quarter 00:00"
  destination_dataset_id = "results"

  params = {
    destination_table_name_template = "my_table"
    write_disposition               = "WRITE_APPEND"
    query                           = "SELECT name FROM tabl WHERE x = 'y'"
  }
}
Upvotes: 2
Views: 2574
Reputation: 238249
You can read all the files in your locals:
locals {
  query_files = fileset(path.module, "queries/*.yaml")
  queries = {
    for query_file in local.query_files :
    query_file => yamldecode(file("${path.module}/${query_file}"))
  }
}
then use for_each to create your resources:
resource "google_bigquery_data_transfer_config" "query_config" {
  for_each = local.queries

  display_name           = each.value.display_name
  data_source_id         = each.value.data_source_id
  schedule               = each.value.schedule
  destination_dataset_id = each.value.destination_dataset_id

  params = {
    destination_table_name_template = each.value.destination_table_name_template
    write_disposition               = each.value.write_disposition
    query                           = each.value.query
  }
}
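If you would rather take the Python route the question mentions, a minimal sketch could look like the following. It assumes flat `key: "value"` files exactly like the example (for real-world YAML, swap the hand-rolled parser for PyYAML's `yaml.safe_load`); the function names `parse_flat_yaml` and `generate_main_tf` are illustrative, not a library API.

```python
import pathlib

def parse_flat_yaml(text):
    """Parse flat 'key: "value"' lines into a dict.

    Only handles the simple files shown above; use yaml.safe_load
    from PyYAML for anything more complex.
    """
    data = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")  # split on the first colon only
        data[key.strip()] = value.strip().strip('"')
    return data

TEMPLATE = '''resource "google_bigquery_data_transfer_config" "{name}" {{
  display_name           = "{display_name}"
  data_source_id         = "{data_source_id}"
  schedule               = "{schedule}"
  destination_dataset_id = "{destination_dataset_id}"

  params = {{
    destination_table_name_template = "{destination_table_name_template}"
    write_disposition               = "{write_disposition}"
    query                           = "{query}"
  }}
}}
'''

def generate_main_tf(queries_dir="queries", output="main.tf"):
    """Render one resource block per YAML file, named after the file stem."""
    blocks = []
    for path in sorted(pathlib.Path(queries_dir).glob("*.yaml")):
        cfg = parse_flat_yaml(path.read_text())
        blocks.append(TEMPLATE.format(name=path.stem, **cfg))
    pathlib.Path(output).write_text("\n".join(blocks))
```

Running `generate_main_tf()` from the repo root would then rewrite main.tf with one `google_bigquery_data_transfer_config` block per file in queries/, e.g. `alpha.yaml` becomes a resource named `alpha`.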
Upvotes: 7