Reputation: 21675
I am running an Apache Spark application with YARN on a Hadoop cluster. After the program finishes, is there a way to check its CPU usage profile? Basically, I want a profiling log at intervals of, say, 1 or 2 seconds.
Upvotes: 0
Views: 570
Reputation: 1496
You can use the ResourceManager REST APIs.
Basically, you need to implement a REST client that queries the ResourceManager every 1 or 2 seconds and writes your own logs at runtime.
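A minimal sketch of such a client, assuming the ResourceManager web UI is reachable at `http://resourcemanager:8088` (replace with your RM host:port) and using the standard `/ws/v1/cluster/apps/{appid}` endpoint; the application id in `__main__` is a hypothetical placeholder:

```python
import json
import time
import urllib.request

# Assumed ResourceManager address -- replace with your cluster's RM host:port.
RM_URL = "http://resourcemanager:8088"

def fetch_app(app_id):
    """Query the RM REST API for one application's current status."""
    url = f"{RM_URL}/ws/v1/cluster/apps/{app_id}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["app"]

def extract_metrics(app):
    """Pull the resource-usage fields out of an 'app' JSON object."""
    return {
        "state": app.get("state"),
        "allocatedVCores": app.get("allocatedVCores"),
        "allocatedMB": app.get("allocatedMB"),
        "vcoreSeconds": app.get("vcoreSeconds"),
        "memorySeconds": app.get("memorySeconds"),
    }

def poll(app_id, interval=2.0):
    """Append one metrics line to <app_id>.log every `interval` seconds
    until the application is no longer running."""
    while True:
        metrics = extract_metrics(fetch_app(app_id))
        with open(f"{app_id}.log", "a") as log:
            log.write(f"{time.time():.0f} {json.dumps(metrics)}\n")
        if metrics["state"] not in ("ACCEPTED", "RUNNING"):
            break
        time.sleep(interval)

if __name__ == "__main__":
    poll("application_1500000000000_0001")  # hypothetical application id
```

Note that YARN only reports container-level allocation (vcores/MB) and cumulative `vcoreSeconds`, not per-process CPU percentages; for finer-grained CPU profiling you would have to sample the executor processes on the nodes themselves.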
Upvotes: 1