Reputation: 435
I have an Airflow 1.10.2 installation with Python 3.5.6.
The metadata lives in a MySQL database and execution uses the LocalExecutor.
I have created a sample helloworld.py DAG with the schedule below.
from datetime import datetime, timedelta

from airflow import DAG

default_args = {
    'owner': 'Ashish',
    'depends_on_past': False,
    'start_date': datetime(2019, 2, 15),
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}
dag = DAG('Helloworld', schedule_interval='56 6 * * *', default_args=default_args)
But the scheduler didn't pick up this DAG at the scheduled time, whereas when I trigger it manually from the UI it runs perfectly fine.
My concern is why the scheduler fails to pick up the DAG run at the scheduled time.
Upvotes: 1
Views: 1210
Reputation: 2456
I think you are confused about start_date. Your current schedule is set to run at 6:56 AM UTC starting on 2/15/2019. With this schedule, the DAG will run tomorrow with no problem. This is because Airflow runs jobs at the end of an interval, not at the beginning.
start_date is not when you want the DAG to be triggered, but when you want the scheduling interval to start. If you wanted your job to run today, the start date should be 'start_date': datetime(2019, 2, 14). Then your current daily scheduling interval would have ended at 6:56 AM today as intended and your DAG would have run.
Taken from this answer.
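For illustration, here is a minimal sketch of how the corrected DAG definition might look with the earlier start date. The DummyOperator task is a hypothetical placeholder, since your actual helloworld.py presumably defines its own tasks:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

default_args = {
    'owner': 'Ashish',
    'depends_on_past': False,
    # Start one day earlier so the first 6:56 AM interval closes on 2/15.
    'start_date': datetime(2019, 2, 14),
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

dag = DAG('Helloworld', schedule_interval='56 6 * * *', default_args=default_args)

# Hypothetical placeholder task so the DAG has something to run.
hello = DummyOperator(task_id='hello', dag=dag)

With this start date, the first run the scheduler creates carries execution_date 2019-02-14 but actually starts at 6:56 AM UTC on 2019-02-15, i.e. once that first interval has ended.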
Upvotes: 2