Reputation: 440
When multiple `KubernetesPodOperator` tasks are defined in an Airflow DAG, all the tasks get executed in parallel.
To achieve sequential execution, dependencies can be defined, e.g. `task1 >> task2 >> task3`.
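For illustration, here is a minimal sketch of such a DAG (operator arguments, images, and names are placeholders, and the import path may vary with the version of the `cncf.kubernetes` provider):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)

with DAG(
    dag_id="sequential_pods",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    task1 = KubernetesPodOperator(
        task_id="task1",
        name="task1",
        namespace="default",
        image="busybox",
        cmds=["sh", "-c", "echo step 1"],
    )
    task2 = KubernetesPodOperator(
        task_id="task2",
        name="task2",
        namespace="default",
        image="busybox",
        cmds=["sh", "-c", "echo step 2"],
    )
    task3 = KubernetesPodOperator(
        task_id="task3",
        name="task3",
        namespace="default",
        image="busybox",
        cmds=["sh", "-c", "echo step 3"],
    )

    # The dependency chain is what forces sequential execution.
    task1 >> task2 >> task3
```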
The problem with this approach is the failure scenario: task1 alone cannot be re-executed, because its downstream tasks will run again as soon as task1 completes.

How can I execute tasks in sequence without defining dependencies? I don't want to modify the Airflow config; I need settings specific to an Airflow DAG/task. I hope that is supported.
Upvotes: 0
Views: 889
Reputation: 5096
Sequential execution requires defining dependencies. However, if you just want to limit the number of running tasks in your DAG and run only one task at a time, regardless of the order of execution, you can set the DAG `concurrency`
argument to 1, or use a pool of size 1 for all the DAG's tasks. With either option, you will have only a single task running across all the runs.
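Here is a minimal sketch of both options (assuming Airflow 2.x; the DAG-level argument is called `concurrency` before Airflow 2.2 and `max_active_tasks` afterwards, and the pool named `single_slot` is hypothetical and would have to be created with one slot under Admin -> Pools):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="one_task_at_a_time",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
    # Option 1: allow at most one running task instance for this DAG
    # (use concurrency=1 on Airflow versions older than 2.2).
    max_active_tasks=1,
) as dag:
    # Option 2: assign every task to a pool with a single slot,
    # so the pool itself serializes the task instances.
    t1 = BashOperator(task_id="t1", bash_command="echo 1", pool="single_slot")
    t2 = BashOperator(task_id="t2", bash_command="echo 2", pool="single_slot")
    t3 = BashOperator(task_id="t3", bash_command="echo 3", pool="single_slot")
```

Note that without dependencies the scheduler decides which task starts first, so both options give you one-at-a-time execution but not a guaranteed order.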
If your problem with dependencies is just the need to clear a task's state with or without also clearing the state of its downstream tasks, you can use the UI Clear action with or without this option:
Upvotes: 1