Reputation: 103
I'm running Spark applications on YARN, when I kill the job using:
yarn application -kill application_XYZ
I cannot get to the Spark job GUI of the killed application from the Hadoop GUI (ResourceManager). When I open the Spark history server directly and look under "Incomplete applications", the application's logs are displayed. When a job completes normally (is not killed), the logs can be reached the usual way: Hadoop GUI -> Spark history server. I'm using the YARN log aggregation service to aggregate logs. I can also access the application logs with:
yarn logs -applicationId application_XYZ
Have you experienced the same behaviour when you kill a Spark application? Is there anything wrong with killing an application this way?
Upvotes: 2
Views: 1037
Reputation: 3222
There is nothing wrong with killing the application like that. And yes, the Hadoop UI does not show the output of killed jobs, but as you mentioned, you can still see it from the logs on the master.
Upvotes: 0