TheSneak

Reputation: 510

Spark SQL How to Allow Some Tasks To Fail but Overall Job Still Succeed?

I have a Spark job where a small minority of the tasks keep failing, which causes the whole job to fail, and nothing gets written to the table where the results are supposed to go. Is there a way to get Spark to tolerate a few failed tasks and still write the output from the successful ones? I don't actually need 100% of the data to get through, so I'm fine with a few tasks failing.

Upvotes: 0

Views: 223

Answers (1)

Ged

Reputation: 18108

No, that is not possible; it is not part of Spark's design. Once a task exhausts its retries (spark.task.maxFailures), its stage fails, and a failed stage fails the whole job, so there is no partial-success mode. No is also an answer.
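The usual alternative is to keep tasks from failing in the first place by handling bad records inside the transformation itself. Below is a minimal PySpark sketch of that idea; the input path, the output table name, and the parse logic are all hypothetical stand-ins for whatever your job actually does:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, udf
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("tolerate-bad-records").getOrCreate()

    def parse_safely(raw):
        # Stand-in for the real per-record transformation; it returns None
        # instead of raising, so a bad record never fails the task.
        try:
            return raw.strip().upper()
        except Exception:
            return None

    parse_udf = udf(parse_safely, StringType())

    # Hypothetical input path and output table.
    df = spark.read.text("s3://my-bucket/input/")
    result = (
        df.withColumn("parsed", parse_udf(col("value")))
          .filter(col("parsed").isNotNull())   # drop records that failed to parse
    )
    result.write.mode("overwrite").saveAsTable("results")

The trade-off is that bad records are dropped silently, so if you need to know how much data was skipped, count or log the null rows before filtering them out.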

Upvotes: 1
