If I am using a BigQueryOperator with a SQL template, how can I pass an argument to the SQL?
File: ./sql/query.sql
SELECT * FROM `dataset.{{ task_instance.variable_for_execution }}`
File: dag.py
BigQueryOperator(
    task_id='compare_tables',
    sql='./sql/query.sql',
    use_legacy_sql=False,
    dag=dag,
)
Upvotes: 10
Views: 12146
You can pass an argument via the params parameter, which can then be referenced in any templated field as follows:
BigQueryOperator(
    task_id='compare_tables',
    sql='SELECT * FROM `dataset.{{ params.param1 }}`',
    params={
        'param1': 'value1',
        'param2': 'value2'
    },
    use_legacy_sql=False,
    dag=dag
)
OR you can have the SQL separate in file:
File: ./sql/query.sql
SELECT * FROM `dataset.{{ params.param1 }}`
The params parameter's input should be a dictionary. In general, any operator in Airflow can be passed this params parameter.
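Conceptually, Airflow renders templated fields like sql with Jinja, merging the params dict into the template context before the task runs. The following is only a minimal illustrative sketch of that substitution step using plain string replacement, not Airflow's actual Jinja2-based implementation; the render_sql helper is hypothetical:

```python
# Minimal sketch of rendering a templated SQL string with a params dict.
# Airflow itself uses Jinja2; this hand-rolled version only handles the
# simple "{{ params.<key> }}" placeholder form for illustration.

def render_sql(template: str, params: dict) -> str:
    """Replace each {{ params.<key> }} placeholder with its value."""
    rendered = template
    for key, value in params.items():
        rendered = rendered.replace("{{ params.%s }}" % key, str(value))
    return rendered

sql = "SELECT * FROM `dataset.{{ params.param1 }}`"
print(render_sql(sql, {"param1": "value1", "param2": "value2"}))
# -> SELECT * FROM `dataset.value1`
```

Unused keys (param2 here) are simply ignored, just as an unreferenced entry in params has no effect on the rendered query.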
Upvotes: 15