parakeet
parakeet

Reputation: 465

Running a Terraform command via Airflow operator

I'm running Apache Airflow on Cloud Composer (composer-1.14.2-airflow-1.10.14). I want to use Terraform to create infrastructure but I can't find any operators to do this. As a workaround I'm using BashOperator like this:

create_vm = BashOperator(
    task_id='create_cluster',
    bash_command='''
        sudo apt-get update -y && \
        sudo apt-get install -y software-properties-common && \
        curl -fsSL https://apt.releases.hashicorp.com/gpg | sudo apt-key add - && \
        sudo apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main" && \
        sudo apt-get update && sudo apt-get install -y terraform && \
        cd /home/airflow/gcs/.../ && \
        terraform init && \
        terraform plan -out /home/airflow/gcs/.../plantf && \
        terraform apply /home/airflow/gcs/.../plantf
    ''',
    dag=dag)

This really doesn't feel like best practice. Is there a recommended way to run Terraform commands via an Airflow operator?

Upvotes: 5

Views: 3375

Answers (1)

The Marmoset King

Reputation: 41

After much wailing and gnashing of teeth, I was able to accomplish what you are trying to do. I was using Docker, but hopefully the same principles can be applied.

  1. I wrote a shell script to override the Airflow worker entrypoint so I could install and initialize Terraform.
#!/bin/bash

terraform_version="0.14.8"
terraform_installer_zipfile="terraform_${terraform_version}_linux_amd64.zip"

curl -O -L https://releases.hashicorp.com/terraform/${terraform_version}/${terraform_installer_zipfile}
python3 unzip-terraform.py ${terraform_installer_zipfile}

chmod +x terraform

./terraform init

/entrypoint ${1} ${2}
  2. The official Airflow image has limited permissions and I couldn't install unzip, so I used a Python script instead. (This is the unzip-terraform.py referenced in the script from step 1.)
import sys
import zipfile

with zipfile.ZipFile(sys.argv[1], 'r') as terraformInstaller:
    terraformInstaller.extractall()
  3. I wrote two more shell scripts to be used by the BashOperators in step 4. (I'm using Terraform to set up GCP Compute resources.)
#!/bin/bash
# terraform-plan-apply.sh

cd /opt/airflow

chmod +x terraform

./terraform plan -out=gcp-test

./terraform apply "./gcp-test"

#!/bin/bash
# terraform-plan-destroy.sh

cd /opt/airflow

chmod +x terraform

./terraform destroy -auto-approve
  4. I used BashOperators to stand up and tear down the Compute resources.
terraform_build_task = BashOperator(
    task_id='terraform_build_task',
    dag=dag,
    bash_command='/opt/airflow/terraform-plan-apply.sh '
)
terraform_destroy_task = BashOperator(
    task_id='terraform_destroy_task',
    dag=dag,
    bash_command='/opt/airflow/terraform-plan-destroy.sh '
)
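Incidentally, the unzip done by the Python script above and the repeated chmod +x calls in the shell scripts could be collapsed into a single helper. A minimal sketch (the function name is my own, not from my actual setup):

```python
import os
import stat
import sys
import zipfile

def extract_and_make_executable(zip_path, member="terraform", dest="."):
    """Extract the terraform binary from the release zip and set its
    executable bits, replacing the separate unzip + chmod steps."""
    with zipfile.ZipFile(zip_path) as archive:
        archive.extract(member, path=dest)
    binary = os.path.join(dest, member)
    # Equivalent of `chmod +x`: add execute permission for user, group, other.
    mode = os.stat(binary).st_mode
    os.chmod(binary, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
    return binary

if __name__ == "__main__" and len(sys.argv) > 1:
    print(extract_and_make_executable(sys.argv[1]))
```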

It works quite well. Also, I should point out that the trailing space in the bash_command values is intentional: without it, Airflow treats a bash_command ending in .sh as the path to a Jinja template file and tries to render it, which fails.
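If you'd rather avoid the extra shell scripts entirely, the same plan/apply sequence could also be driven from a Python callable (e.g. via a PythonOperator) using subprocess. This is only a sketch of the idea, not something from my setup, and the function names are made up:

```python
import subprocess

def run_terraform(args, workdir, binary="./terraform"):
    """Run a single terraform subcommand; a non-zero exit raises
    CalledProcessError, which makes the Airflow task fail."""
    result = subprocess.run(
        [binary, *args],
        cwd=workdir,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

def plan_and_apply(workdir="/opt/airflow", plan_file="gcp-test"):
    # Same sequence as terraform-plan-apply.sh: plan to a file, then apply it.
    run_terraform(["plan", f"-out={plan_file}"], workdir)
    return run_terraform(["apply", plan_file], workdir)
```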

Upvotes: 4
