pythonic

Reputation: 21675

How to get CPU usage profile of a yarn based Spark application

I am running an Apache Spark application on a Hadoop cluster using YARN. After the program finishes, is there a way to check its CPU usage profile? Basically, I want profiling logs at intervals of, say, 1 or 2 seconds.

Upvotes: 0

Views: 570

Answers (1)

RojoSam

Reputation: 1496

You can use the ResourceManager REST APIs:

https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html#Cluster_Applications_API

Basically, you need to implement a REST client that queries the ResourceManager every 1 or 2 seconds and writes your own logs at runtime.
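A minimal sketch of such a client in Python, using only the standard library. It polls the Cluster Applications API endpoint (`/ws/v1/cluster/apps`) and logs per-application resource metrics such as `allocatedVCores` and `vcoreSeconds`, which the API reports for running applications. The ResourceManager hostname and the polling interval are assumptions you would adjust for your cluster:

```python
import json
import time
import urllib.request

# Assumed ResourceManager web address -- replace with your cluster's
# actual host and port (8088 is the default RM web UI/REST port).
RM_URL = "http://resourcemanager:8088/ws/v1/cluster/apps"


def parse_app_metrics(payload):
    """Extract per-application resource metrics from a decoded
    Cluster Applications API JSON response."""
    apps = (payload.get("apps") or {}).get("app") or []
    rows = []
    for app in apps:
        rows.append({
            "id": app["id"],
            "state": app["state"],
            # Currently allocated containers' resources:
            "allocatedVCores": app.get("allocatedVCores", 0),
            "allocatedMB": app.get("allocatedMB", 0),
            # Cumulative CPU usage; deltas between polls give a
            # rough CPU-usage-over-time profile:
            "vcoreSeconds": app.get("vcoreSeconds", 0),
        })
    return rows


def poll(interval=2, states="RUNNING"):
    """Query the ResourceManager every `interval` seconds and print
    a timestamped log line per application."""
    url = RM_URL + "?states=" + states
    while True:
        with urllib.request.urlopen(url) as resp:
            payload = json.load(resp)
        for row in parse_app_metrics(payload):
            print(time.time(), row)
        time.sleep(interval)
```

Calling `poll(interval=1)` while your Spark job runs produces one log line per application per second; redirecting stdout to a file gives you the runtime log you described. Note that this reports YARN-level container allocation and cumulative vcore-seconds, not OS-level CPU percentages per process.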

Upvotes: 1
