AKC

Reputation: 1023

Spark Streaming failed executor tasks

When I look at the Jobs tab in the Spark UI, I can see a task status like 20/20 (4 failed).

Does this mean there is data loss for the failed tasks? Aren't those failed tasks retried on a different executor?


Upvotes: 1

Views: 1131

Answers (1)

Glennie Helles Sindholt

Reputation: 13154

While you should be wary of failing tasks (they are frequently an indicator of an underlying memory issue), you need not worry about data loss. The stages have been marked as successfully completed, so the tasks that failed were in fact (eventually) successfully processed.
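Spark retries a failed task up to `spark.task.maxFailures` times (default 4), typically scheduling the retry on a different executor, and only aborts the stage once that limit is exceeded. As a minimal sketch (the application file name is hypothetical), the threshold can be adjusted at submit time:

```shell
# Allow up to 8 attempts per task before the stage is aborted.
# The default is 4; raising it tolerates flaky executors, but it can
# also mask a real problem such as executors dying from memory pressure.
spark-submit \
  --conf spark.task.maxFailures=8 \
  your_streaming_app.py   # hypothetical application file
```

If the retries keep landing on the same error, checking the executor logs for OOM kills or GC pressure is usually more productive than raising the limit.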

Upvotes: 3
