Reputation: 95
I have a slightly complex setup: I run my Airflow (v1.10.13) pipelines in local time (configured via the VM's timezone). The following DAG was marked as successful for the Monday run, but the task within it was never scheduled (and thus produced no logs whatsoever). I have had issues with the Airflow scheduler and non-UTC timezones in the past, so I wonder if that could be the cause here as well.
from airflow import DAG
from datetime import timedelta
from somewhere import get_localized_yesterday
import prepered_tasks as t
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': get_localized_yesterday(),
    'email': [],
    'email_on_failure': True,
    'email_on_retry': False,
    'retry_delay': timedelta(minutes=1)
}
# Schedule the DAG at 2 a.m. on weekdays.
dag = DAG(
    'descriptive_DAG_name',
    default_args=default_args,
    description='',
    schedule_interval='0 2 * * Mon-Fri',
    tags=['PROD']
)
single_task = t.task_partial(dag=dag)
single_task
task_partial is a task object wrapped in a partial, so I only need to supply the dag argument to instantiate it. This pattern works as intended in my other pipelines.
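For context, here is roughly what the two helpers look like (a minimal sketch, not the real code: functools.partial, pendulum, the Europe/Berlin timezone and the _do_work callable are stand-ins for what actually lives in my somewhere and prepered_tasks modules):

import functools
import pendulum
from airflow.operators.python_operator import PythonOperator

LOCAL_TZ = pendulum.timezone('Europe/Berlin')  # stand-in for the VM's timezone

def get_localized_yesterday():
    # Midnight yesterday, timezone-aware, in the VM's local timezone.
    return pendulum.today(LOCAL_TZ).subtract(days=1)

def _do_work(**context):
    print('running for', context['execution_date'])

# The partial pre-binds everything except `dag`,
# so a pipeline only has to call task_partial(dag=dag).
task_partial = functools.partial(
    PythonOperator,
    task_id='single_task',
    python_callable=_do_work,
    provide_context=True
)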
I checked the usual suspects:
Upvotes: 1
Views: 1296
Reputation: 2408
There was a bug in Airflow 1.10.13 and the release was yanked.
You should upgrade to 1.10.14.
Quote from issue:
After performing an upgrade to v1.10.13 we noticed that tasks in some of our DAGs were not being scheduled. After a bit of investigation we discovered that by commenting out 'depends_on_past': True the issue went away.
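If you install Airflow with pip, the upgrade is roughly the following (a sketch; pin the constraints file to your own Python version, 3.7 is only an example):

pip install "apache-airflow==1.10.14" \
    --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.14/constraints-3.7.txt"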
Upvotes: 1