Reputation: 714
I currently have over 100 DAGs running in production. I am aware of how to add alerting with on_failure_callback and with operators triggered by upstream failures, but is there a way to configure Airflow itself to always send an email when a DAG fails, without having to go through every one of my DAGs and add failure alerting individually?
Upvotes: 0
Views: 2393
Reputation: 6538
Not as far as I know, but I have this helper to handle my global/default dag/operator settings:
from datetime import timedelta

def on_failure_callback(context):
    ...  # e.g. send an email or page using the task `context`

def on_success_callback(context):
    ...

def build_default_args(**kwargs):
    # Shared defaults for every DAG; kwargs lets an individual DAG override any of them.
    default_args = {
        'on_failure_callback': on_failure_callback,
        'on_success_callback': on_success_callback,
        'owner': 'me',
        'queue': 'default',
        'execution_timeout': timedelta(hours=1),
        'retries': 3,
        'retry_delay': timedelta(seconds=10),
    }
    default_args.update(kwargs)
    return default_args
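For example, the failure callback could send an email with Airflow's own helper. A minimal sketch, assuming SMTP is configured in airflow.cfg; the recipient address is a placeholder:

from airflow.utils.email import send_email

def on_failure_callback(context):
    # `context` is the same dict Airflow passes to templates; it includes
    # keys such as 'task_instance' and 'exception'.
    ti = context['task_instance']
    send_email(
        to='alerts@example.com',  # placeholder recipient
        subject='Airflow failure: {}.{}'.format(ti.dag_id, ti.task_id),
        html_content='Task failed with: {}'.format(context.get('exception')),
    )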
Then in each DAG:
from datetime import datetime, timedelta
from airflow import DAG

dag = DAG(
    dag_id='my_dag',
    default_args=build_default_args(
        start_date=datetime(2017, 9, 20),
        execution_timeout=timedelta(hours=8),  # overrides the 1-hour default
    ),
    schedule_interval='@hourly',
)
Alternatively, you could write a custom base DAG class, but either way you would still have to go back and touch each of your 100+ DAGs once.
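If you go the base-class route, a rough sketch might look like this (DefaultsDAG is just an illustrative name, and it reuses build_default_args from above):

from airflow import DAG

class DefaultsDAG(DAG):
    # Hypothetical DAG subclass that injects the shared defaults automatically.
    def __init__(self, *args, **kwargs):
        # Merge any per-DAG default_args on top of the global ones.
        kwargs['default_args'] = build_default_args(**(kwargs.get('default_args') or {}))
        super(DefaultsDAG, self).__init__(*args, **kwargs)

Each DAG file then only supplies what differs:

dag = DefaultsDAG(
    dag_id='my_dag',
    default_args={'start_date': datetime(2017, 9, 20)},
    schedule_interval='@hourly',
)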
Upvotes: 6