Kar

Reputation: 1016

Airflow sql_path not able to read the sql files when passed as Jinja Template Variable

I am trying to use a Jinja template variable instead of Variable.get('sql_path'), so as to avoid hitting the database on every scan of the DAG file.

Original code

import datetime
import os
from functools import partial
from datetime import timedelta
from airflow.models import DAG, Variable
from airflow.contrib.operators.snowflake_operator import SnowflakeOperator
from alerts.email_operator import dag_failure_email

SNOWFLAKE_CONN_ID = 'etl_conn'

tmpl_search_path = []
for subdir in ['business/', 'audit/', 'business/transform/']:
    tmpl_search_path.append(os.path.join(Variable.get('sql_path'), subdir))


def get_db_dag(
    *,
    dag_id,
    start_date,
    schedule_interval,
    max_taskrun,
    max_dagrun,
    proc_nm,
    load_sql
):

    default_args = {
        'owner': 'airflow',
        'start_date': start_date,
        'provide_context': True,
        'execution_timeout': timedelta(minutes=max_taskrun),
        'retries': 0,
        'retry_delay': timedelta(minutes=3),
        'retry_exponential_backoff': True,
        'email_on_retry': False,
    }


    dag = DAG(
        dag_id=dag_id,
        schedule_interval=schedule_interval,
        dagrun_timeout=timedelta(hours=max_dagrun),
        template_searchpath=tmpl_search_path,
        default_args=default_args,
        max_active_runs=1,
        catchup='{{var.value.dag_catchup}}',
        on_failure_callback=dag_failure_email,
    )


    load_table = SnowflakeOperator(
        task_id='load_table',
        sql=load_sql,
        snowflake_conn_id=SNOWFLAKE_CONN_ID,
        autocommit=True,
        dag=dag,
    )


    return dag

# ======== DAG DEFINITIONS #

edw_table_A = get_db_dag(
    dag_id='edw_table_A',
    start_date=datetime.datetime(2020, 5, 21),
    schedule_interval='0 5 * * *',
    max_taskrun=3,  # Minutes
    max_dagrun=1,  # Hours
    load_sql='recon/extract.sql',
)

When I replaced Variable.get('sql_path') with the Jinja template '{{var.value.sql_path}}' as below and ran the DAG, it threw the error below; it was not able to find the file in the subdirectory of the SQL folder.

tmpl_search_path = []
for subdir in ['bus/', 'audit/', 'business/snflk/']:
    tmpl_search_path.append(os.path.join('{{var.value.sql_path}}', subdir))

Got the error below: jinja2.exceptions.TemplateNotFound: extract.sql

Upvotes: 0

Views: 1482

Answers (1)

SergiyKolesnikov

Reputation: 7815

Templates are not rendered everywhere in a DAG script; usually they are rendered only in the templated parameters of operators. So unless you pass the elements of tmpl_search_path to some templated parameter, {{var.value.sql_path}} will not be rendered.
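To illustrate (a minimal sketch, independent of Airflow): at DAG-parse time the template string is just plain text, so joining it into a path keeps the braces verbatim and Jinja never sees it.

```python
import os

# Plain Python string handling at DAG-parse time: nothing renders the
# template, so the braces survive into the search path verbatim.
path = os.path.join('{{var.value.sql_path}}', 'audit/')

# Airflow then looks for SQL templates under a directory literally named
# "{{var.value.sql_path}}", which does not exist -- hence TemplateNotFound.
print(path)
```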

The template_searchpath parameter of DAG is not templated, which is why you cannot pass Jinja templates to it.

The options I can think of are:

  1. Hardcode the value of Variable.get('sql_path') in the pipeline script.
  2. Save the value of Variable.get('sql_path') in a configuration file and read it from there in the pipeline script.
  3. Move the Variable.get() call out of the for-loop. This results in one database request instead of three.

More info about templating in Airflow.

Upvotes: 1
