I am trying to send the default Spark metrics from my application to a StatsD sink at the job level, not the cluster level, so I set the necessary configuration on the SparkSession in code. On my local single-node setup this works: while the application runs, I receive all the UDP packets. I then deployed the same jar to an AWS Databricks cluster (both single-node and multi-node), but there I receive no metrics at all. However, if I override the Spark config under Advanced Options -> Compute in Databricks, I do receive metrics, and the configuration shows up on the Environment tab in the Spark UI. A sketch of the session-level configuration I am setting is shown below.
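For context, this is roughly what my session-level configuration looks like (the host, port, and prefix here are placeholders, not my real values):

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch of the session-level StatsD sink configuration.
// Host/port/prefix are placeholder values.
val spark = SparkSession.builder()
  .appName("statsd-metrics-test")
  .config("spark.metrics.conf.*.sink.statsd.class",
          "org.apache.spark.metrics.sink.StatsdSink")
  .config("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1")
  .config("spark.metrics.conf.*.sink.statsd.port", "8125")
  .config("spark.metrics.conf.*.sink.statsd.period", "10")
  .config("spark.metrics.conf.*.sink.statsd.unit", "seconds")
  .config("spark.metrics.conf.*.sink.statsd.prefix", "myapp")
  .getOrCreate()
```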
I also tried updating "metrics.properties" through an init script on the Databricks cluster, but the values do not appear on the Environment tab in the Spark UI. In addition, I overrode the Spark session configuration again after the session was created, and I print the Spark configuration values to the console, but they are still not reflected on the Environment tab. That retry looks roughly like the sketch below.
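A sketch of that retry, using the same placeholder values: re-setting the sink properties on the live session and printing them back.

```scala
// Re-set the sink properties after the session has been created
// (same placeholder host/port as above).
spark.conf.set("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1")
spark.conf.set("spark.metrics.conf.*.sink.statsd.port", "8125")

// These print the values I just set in the driver console...
println(spark.conf.get("spark.metrics.conf.*.sink.statsd.host"))
println(spark.conf.get("spark.metrics.conf.*.sink.statsd.port"))
// ...but the keys still do not show up on the Environment tab
// in the Spark UI on the Databricks cluster.
```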