Reputation: 1147
I have a job that parses many URLs. If each task processes a single URL, then the task will fail whenever parsing that URL throws an exception. With the regular Hadoop behaviour, the task will be reattempted 3 times, and after that the whole job will fail. Can I somehow intervene on task failure and write my own code that disregards the failed task?
Upvotes: 0
Views: 107
Reputation: 460
Why don't you catch the exception?
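For example, here is a minimal sketch of a mapper that catches the exception per record, counts the bad URL, and moves on, so one unparsable URL never fails the task. It assumes the new mapreduce API; the parseUrl() helper and counter names are hypothetical stand-ins for your own parsing logic:

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class UrlMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        try {
            Text result = parseUrl(value);   // hypothetical parsing helper
            context.write(value, result);
        } catch (Exception e) {
            // Count and skip the bad URL instead of letting the task fail.
            context.getCounter("parse", "bad_urls").increment(1);
        }
    }

    private Text parseUrl(Text url) {
        // ... your actual parsing logic goes here ...
        return new Text(url.toString().toLowerCase());
    }
}
```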
Alternatively, you can set maxMapTaskFailuresPercent and maxReduceTaskFailuresPercent to suitable values, so the job tolerates a certain percentage of failed tasks instead of aborting.
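A minimal sketch of that second option, using the old mapred API's JobConf setters; the 20% thresholds and the job name are illustrative values, not recommendations:

```java
import org.apache.hadoop.mapred.JobConf;

public class JobSetup {
    public static JobConf configure() {
        JobConf conf = new JobConf(JobSetup.class);
        conf.setJobName("url-parsing");
        // Allow up to 20% of map/reduce tasks to fail without failing the job.
        conf.setMaxMapTaskFailuresPercent(20);
        conf.setMaxReduceTaskFailuresPercent(20);
        return conf;
    }
}
```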
Upvotes: 1