Reputation: 459
I am trying to use a packaged DAG with the Celery Executor, but the scheduler and worker are not picking up the job. I have restarted the airflow webserver and the airflow scheduler, but still no success. I have even reset the DB with airflow resetdb, but still nothing.
I am getting the following messages:
[INFO] Handling signal: ttou
[INFO] Worker exiting (pid: 31418)
[INFO] Handling signal: ttin
[INFO] Booting worker with pid: 32308
The DAGs are not picked up by the scheduler, and they do not run even when triggered manually.
My zip file has the following contents:
unzip alerting.zip
creating: airflow_utils/
inflating: airflow_utils/enums.py
inflating: airflow_utils/psql_alerting_dag.py
extracting: airflow_utils/__init__.py
inflating: airflow_utils/hive_alerting_dag.py
inflating: airflow_utils/alerting_utils.py
inflating: alerting_12hrs.py
inflating: alerting_15mins.py
inflating: alerting_3hrs.py
If I place all these files directly in the dags folder instead of packaging them, the airflow scheduler is able to schedule the DAGs.
What am I doing wrong with packaged DAGs?
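For context, Airflow parses the .py files at the root of the zip for DAG objects and adds the zip itself to sys.path, which is how the bundled airflow_utils package becomes importable. Each of my top-level files looks roughly like this (the helper name run_alert_checks is illustrative, not my actual code):

# alerting_15mins.py -- sits at the root of alerting.zip
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# The zip root is on sys.path, so the bundled package imports normally.
from airflow_utils.alerting_utils import run_alert_checks  # illustrative helper

dag = DAG(
    dag_id="alerting_15mins",
    start_date=datetime(2017, 1, 1),
    schedule_interval=timedelta(minutes=15),
)

check = PythonOperator(
    task_id="run_alert_checks",
    python_callable=run_alert_checks,
    dag=dag,
)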
Upvotes: 1
Views: 934
Reputation: 459
I was on Airflow 1.8.1, which had problems loading DAGs from zips. This was fixed in 1.8.3: https://issues.apache.org/jira/browse/AIRFLOW-1357
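If you want to guard against this, a quick sanity check of the installed version works; a sketch (the minimum version is taken from the fix above):

# Sketch: refuse to rely on packaged (zip) DAGs on affected Airflow versions.
from distutils.version import LooseVersion

import airflow

if LooseVersion(airflow.__version__) < LooseVersion("1.8.3"):
    raise RuntimeError(
        "Airflow %s has known issues loading DAGs from zips "
        "(see AIRFLOW-1357); unpack the files into the dags folder instead."
        % airflow.__version__
    )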
Upvotes: 1