Reputation: 727
I'm trying to write our first Airflow DAG, and I'm getting the following error when I try to list the tasks with the command airflow list_tasks orderwarehouse:
Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 2038, in resolve_template_files
setattr(self, attr, env.loader.get_source(env, content)[0])
File "/usr/local/lib/python2.7/site-packages/jinja2/loaders.py", line 187, in get_source
raise TemplateNotFound(template)
TemplateNotFound: ./home/deploy/airflow-server/task_scripts/orderwarehouse/load_warehouse_tables.sh
This DAG is not supposed to use a template. I'm only trying to run the shell script in the specified location per the instructions in the docs. The shell script does exist in that location and is spelled correctly. My DAG looks like this:
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}
orderwarehouse = DAG('orderwarehouse', default_args=default_args)
load_mysql = BashOperator(
    task_id='load_warehouse_mysql',
    bash_command='./home/deploy/airflow-server/task_scripts/orderwarehouse/load_warehouse_tables.sh',
    dag=orderwarehouse)
Not sure why it thinks it needs to look for a Jinja template. Running out of ideas on this one, would appreciate if anyone can point me to where I'm going astray. Thanks.
Upvotes: 45
Views: 31458
Reputation: 1223
This is a pitfall of Airflow: BashOperator renders bash_command as a Jinja template, and a string ending in .sh or .bash is interpreted as the path of a template file to load rather than as an inline command. Add a space at the end of your bash_command and it should run fine.
Source: https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=62694614
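A minimal sketch of why the trailing space matters (this mimics the extension check Airflow performs when resolving templated fields; it is not Airflow's actual source code):

```python
# BashOperator declares bash_command as a templated field, and any templated
# string whose suffix matches one of the operator's template extensions
# (".sh", ".bash") is treated as the *path* of a Jinja template file to load.
TEMPLATE_EXTENSIONS = (".sh", ".bash")


def is_treated_as_template_file(bash_command):
    """Return True if Airflow would try to load the string as a template file."""
    return bash_command.endswith(TEMPLATE_EXTENSIONS)


# Without the trailing space, the string matches ".sh", so Airflow tries to
# load it through the Jinja loader and raises TemplateNotFound:
is_treated_as_template_file(
    "/home/deploy/airflow-server/task_scripts/orderwarehouse/load_warehouse_tables.sh")

# With the trailing space, the suffix no longer matches, so the string is
# rendered as an inline shell command and the script simply runs:
is_treated_as_template_file(
    "/home/deploy/airflow-server/task_scripts/orderwarehouse/load_warehouse_tables.sh ")
```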
Upvotes: 90
Reputation: 96
In addition to the answers provided, I had to add a space after the file name in the task definition to get rid of the Jinja "template not found" issue.
Upvotes: 6
Reputation: 31
You should try adding a space at the end of the file path. The same rule applies to whichever operator you are using (note that the parameter is bash_command, not command):

load_mysql = BashOperator(
    task_id='load_warehouse_mysql',
    bash_command='/home/deploy/airflow-server/task_scripts/orderwarehouse/load_warehouse_tables.sh ',
    dag=orderwarehouse)
Upvotes: 3