JChao

Reputation: 2309

Apache Airflow -- Need to always run a process in a dag whether previous process succeeds or not

This usually applies to shutting down some machines.

Say I start an AWS EC2/EMR instance to do some work, and I want to shut it down afterwards to avoid leaving it running.

I would normally do

start instance -> do my work1 -> do my work 2 -> shut down instance

but suppose do my work 2 fails; then shut down instance will never be triggered. Is there a way to still trigger the shut down instance part of the DAG?

Upvotes: 1

Views: 710

Answers (1)

Viraj Parekh

Reputation: 1381

You can change the task's `trigger_rule`. It sounds like you want the trigger rule to be `all_done`, so that your last task executes regardless of whether its upstream tasks succeeded or failed.
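A minimal sketch of this pattern, assuming Airflow 2.x; the DAG id, task ids, and the `echo` commands standing in for the real start/work/shutdown logic are all illustrative:

```python
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.trigger_rule import TriggerRule

with DAG(
    dag_id="ec2_work",  # hypothetical DAG name
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=None,
) as dag:
    start = BashOperator(task_id="start_instance", bash_command="echo start")
    work1 = BashOperator(task_id="work1", bash_command="echo work1")
    work2 = BashOperator(task_id="work2", bash_command="echo work2")

    shutdown = BashOperator(
        task_id="shutdown_instance",
        bash_command="echo shutdown",
        # all_done: run once all upstream tasks have finished,
        # whether they succeeded or failed
        trigger_rule=TriggerRule.ALL_DONE,
    )

    start >> work1 >> work2 >> shutdown
```

With the default rule (`all_success`), a failure in `work2` would leave `shutdown_instance` in the `upstream_failed` state; with `all_done` it still runs, so the instance gets terminated either way.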

Upvotes: 3
