Reputation: 460
After I installed Ganglia, the web UI only shows basic metric information about disk, as follows:
But the Ganglia demo website (please see here) shows many disk iostat metrics, as follows:
Here is my question: how should I configure Ganglia to show these metrics?
I know there are many Ganglia modules on GitHub, but I don't know how to use them. I'm a newbie to Ganglia; can you tell me what I should do? Thank you very much.
Upvotes: 1
Views: 589
Reputation: 6693
Two steps to show Spark metrics in Ganglia:
Rebuild Spark with Ganglia support
Spark's prebuilt releases don't come with Ganglia support because of a licensing issue: Spark is Apache 2.0 and Ganglia is LGPL.
Normally, build/mvn -Pspark-ganglia-lgpl -Pother_profiles_to_enable -DskipTests clean package
should be enough to rebuild it yourself; you can find more info on customizing the Spark build here.
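As a concrete sketch, a full build command could look like the one below. The hadoop-2.7 and yarn profiles are only examples standing in for "other_profiles_to_enable"; pick whatever profiles match your cluster.
./build/mvn -Pspark-ganglia-lgpl -Phadoop-2.7 -Pyarn -DskipTests clean package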
Set up metrics using conf/metrics.properties
I would suggest reading Monitoring and Instrumentation and the metrics configuration template first.
The metrics system is divided into instances which correspond to internal components. Each instance can be configured to report its metrics to one or more sinks. Accepted values for [instance] are "master", "worker", "executor", "driver", and "applications".
A "sink" specifies where metrics are delivered to. Each instance can be assigned one or more sinks.
Ganglia is one of the available sinks, and you can configure it as follows:
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.host=239.2.11.71
*.sink.ganglia.port=8636
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
*.sink.ganglia.ttl=1
*.sink.ganglia.mode=multicast
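Once the file is ready, Spark has to be able to find it: by default it reads $SPARK_HOME/conf/metrics.properties on each node. Alternatively, here is a sketch of shipping the file with the application via spark-submit; the paths, class name, and jar name are just placeholders for your own application, and this assumes a cluster mode where --files places the file in the working directory:
# distribute metrics.properties with the job and point Spark at it
spark-submit \
  --files /path/to/metrics.properties \
  --conf spark.metrics.conf=metrics.properties \
  --class com.example.MyApp \
  my-app.jar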
Upvotes: 1