Reputation: 1926
Folks, pretty new to Airflow. I am trying to send start_date and end_date to a SQL script that runs as a task in my DAG. My initial approach was to template these out and pass them to the PostgresOperator via the params argument, something like:

PostgresOperator(
    task_id='test_edw_job',
    sql='sql/my.sql',
    params={'start_date': start_date, 'end_date': end_date},
)
where start_date and end_date are macros defined as

end_date = "{{ macros.my_plugin.end_date(execution_date) }}"
start_date = "{{ macros.my_plugin.start_date(execution_date) }}"

Inside my SQL file I am accessing these variables as {{ params.start_date }} and {{ params.end_date }} respectively.
But once I start my DAG and look at the rendered task, they are templated out as {{ macros.my_plugin.start_date(execution_date) }} and {{ macros.my_plugin.end_date(execution_date) }}, whereas I am trying to get the actual values of these macros templated here. Am I doing something inherently wrong? Any inputs would be highly appreciated.
Upvotes: 0
Views: 1575
So, after doing more digging into how templating works in Airflow: there is only one pass in which the params are replaced by their values. In my case the params, i.e. start_date and end_date, are themselves templates, which means they get replaced by their corresponding macro definitions (i.e. {{ macros.my_plugin.start_date(execution_date) }}) but not by the output of those macros, i.e. something like 2019-10-10.
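The one-pass behaviour can be illustrated with a tiny mock renderer (a sketch only; real Airflow uses Jinja2, and the macro name is the one from my setup). Substituted values are never re-rendered, so a param that is itself a template comes out literally:

```python
import re

# Minimal one-pass "renderer" mimicking the relevant Jinja behaviour:
# each {{ name }} is looked up once, and the substituted value is
# NOT fed back through the renderer for a second pass.
def render_once(template, context):
    def lookup(match):
        return str(context.get(match.group(1), match.group(0)))
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", lookup, template)

# The param value is itself a template string...
context = {"params.start_date": "{{ macros.my_plugin.start_date(execution_date) }}"}
sql = "SELECT * FROM t WHERE ds >= '{{ params.start_date }}'"

rendered = render_once(sql, context)
print(rendered)
# ...so the macro call lands in the SQL verbatim, unevaluated.
```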
After some trial runs I figured out that I didn't really have to pass start_date and end_date as params to the PostgresOperator at all. I could just template start_date and end_date directly in my SQL file, and as part of the single templating pass these variables get replaced by the values computed by the macros representing them.
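For example, the SQL file itself can call the custom macros directly (a sketch assuming the my_plugin macros and table/column names are as in my setup), with no params needed on the operator:

```sql
-- sql/my.sql: the macro calls are evaluated during Airflow's single
-- Jinja rendering pass, so the actual dates land in the query.
SELECT *
FROM my_table
WHERE event_date BETWEEN '{{ macros.my_plugin.start_date(execution_date) }}'
                     AND '{{ macros.my_plugin.end_date(execution_date) }}';
```

The operator then only needs task_id and sql='sql/my.sql'.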
Hopefully this will help someone in the future who is as new to templating in Airflow as I am.
Upvotes: 2