Reputation: 1493
I have a DAG with three bash tasks that is scheduled to run every day.
I would like to access a unique ID of the DAG instance (maybe the PID?) in all of the bash scripts.
Is there any way to do this?
I am looking for functionality similar to Oozie, where the WORKFLOW_ID can be accessed in the workflow XML or Java code.
Can somebody point me to Airflow documentation on how to use built-in and custom variables in an Airflow DAG?
Many thanks, Pari
Upvotes: 3
Views: 4479
Reputation: 560
An object's attributes can be accessed with dot notation in Jinja2 (see https://airflow.apache.org/code.html#macros). In this case, it would simply be:
{{ dag.dag_id }}
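For example, here is a minimal sketch (the DAG and task names are placeholders, and the import path assumes Airflow 1.x; in Airflow 2+ the operator lives in airflow.operators.bash) that passes the DAG id and the run timestamp into a bash command:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

# Placeholder DAG purely to illustrate the templating.
dag = DAG(
    dag_id='example_dag',
    start_date=datetime(2017, 1, 1),
    schedule_interval='@daily',
)

# The Jinja template in bash_command is rendered per task instance, so the
# script receives the DAG id and the execution timestamp as plain strings.
print_ids = BashOperator(
    task_id='print_ids',
    bash_command='echo "dag_id={{ dag.dag_id }} ts={{ ts }}"',
    dag=dag,
)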
Upvotes: 2
Reputation: 6762
I made use of the fact that the Python DAG object prints out as something like <DAG: dag_name>, so I just use Jinja2 filters to strip the wrapper and recover the DAG name:
{{ dag | replace( '<DAG: ', '' ) | replace( '>', '' ) }}
It's a bit of a hack, but it works.
So, for example:
clear_upstream = BashOperator(
    task_id='clear_upstream',
    trigger_rule='all_failed',
    bash_command="""
        echo airflow clear -t upstream_task -c -d -s {{ ts }} -e {{ ts }} {{ dag | replace( '<DAG: ', '' ) | replace( '>', '' ) }}
    """,
)
Upvotes: 0