Reputation: 115
If I have for example a (multitask) Databricks job with 3 tasks in series and the second one fails - is there a way to start from the second task instead of running the whole pipeline again?
Upvotes: 5
Views: 5529
Reputation: 3463
As of this writing (2024), the Jobs 2.1 API lets you configure retries per task with the following task-level attributes:

- `max_retries` – how many times to retry the task before failing the run
- `min_retry_interval_millis` – minimum wait between a failed attempt and the retry
- `retry_on_timeout` – whether to also retry when the task times out

Note that condition tasks do not support retries.
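A minimal sketch of a `jobs/create` payload using those retry settings, assuming placeholder workspace URL, token, cluster ID, task keys, and notebook paths:

```
import requests

# Placeholders - substitute your own workspace URL and PAT.
HOST = "https://<workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

job_spec = {
    "name": "three-task-pipeline",
    "tasks": [
        {
            "task_key": "first_task",
            "existing_cluster_id": "<cluster-id>",
            "notebook_task": {"notebook_path": "/Repos/pipeline/first_task"},
        },
        {
            "task_key": "second_task",
            "existing_cluster_id": "<cluster-id>",
            "notebook_task": {"notebook_path": "/Repos/pipeline/second_task"},
            "depends_on": [{"task_key": "first_task"}],
            # Retry this task up to 3 times before failing the run.
            "max_retries": 3,
            # Wait at least 60 seconds between attempts.
            "min_retry_interval_millis": 60000,
            # Retry if the task hits its timeout as well.
            "retry_on_timeout": True,
        },
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())  # {"job_id": ...}
```

With retries configured on the task itself, a transient failure in the second task re-runs only that task; the first task's successful output is not recomputed.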
Upvotes: 1
Reputation: 87259
Right now this is not possible, but if you look at the Databricks Q3 2021 public roadmap, there are some items around improving multi-task jobs.

Update (September 2022): this functionality was released in May 2022 under the name Repair & Rerun.
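Repair & Rerun is available from the job run page in the UI ("Repair run") and through the Jobs 2.1 REST API. A minimal sketch of the API call, assuming a placeholder workspace URL, token, run ID, and a task key named second_task:

```
import requests

# Placeholders - substitute your own workspace URL, PAT, and run ID.
HOST = "https://<workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
RUN_ID = 123456  # the failed job run

# Repair the run, re-executing only the named task
# (tasks downstream of it are re-run as well).
resp = requests.post(
    f"{HOST}/api/2.1/jobs/runs/repair",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "run_id": RUN_ID,
        "rerun_tasks": ["second_task"],
        # Alternatively: "rerun_all_failed_tasks": True
    },
)
resp.raise_for_status()
print(resp.json())  # {"repair_id": ...}
```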
Upvotes: 5
Reputation: 1730
If you are running Databricks on Azure, this is possible via Azure Data Factory.
Upvotes: 0