Mohitt

Reputation: 2977

Spark + Yarn: How to retain logs of lost-executors

Working with Spark configured on YARN (in client mode, though that is not especially relevant to the question), I found that some of my executors are failing.

Each executor runs in a YARN container and has its own log file at /var/log/hadoop-yarn/containers/containerID. Some of the (critical) events/logs generated by the container percolate up to the driver, but not all of them. I have observed that when an executor fails, its log file is deleted as soon as the container dies. Is there any way to keep these logs from being deleted, for debugging purposes?

Upvotes: 3

Views: 1617

Answers (1)

Ramzy

Reputation: 7138

Since you have Spark on YARN, this should help you gather all the logs for an application:

yarn logs -applicationId <application ID>
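That command only works if YARN log aggregation is enabled; with aggregation on, the NodeManager uploads each container's logs to HDFS when the application finishes, so they survive the container's death. A minimal yarn-site.xml sketch (these are standard YARN property names, but the retention values shown are just illustrative, and availability of `delete.debug-delay-sec` depends on your Hadoop version):

```xml
<!-- yarn-site.xml: keep container logs around after executors die -->
<property>
  <!-- Aggregate container logs to HDFS when the application finishes -->
  <name>yarn.log-aggregation-enable</name>
  <value>true</value>
</property>
<property>
  <!-- How long to keep aggregated logs (example: 7 days) -->
  <name>yarn.log-aggregation.retain-seconds</name>
  <value>604800</value>
</property>
<property>
  <!-- Delay deletion of local container logs after the app finishes,
       useful for debugging failed executors (example: 10 minutes) -->
  <name>yarn.nodemanager.delete.debug-delay-sec</name>
  <value>600</value>
</property>
```

After restarting the NodeManagers with these settings, `yarn logs -applicationId <application ID>` should return the logs of all containers for that application, including executors that failed.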

Upvotes: 1

Related Questions