dhinesh bala

Reputation: 33

SparkSession configuration at the Job level is not getting applied; Spark Cluster configuration is overriding it

I am trying to send Spark's default metrics from the application to a StatsD sink at the job level, not the cluster level, so I set the necessary configuration on the SparkSession in code. On a local single-node setup this works while the job is running: I receive all the UDP packets. But when I deploy the same jar to an AWS Databricks cluster (single node or multi node), I do not get any metrics. If I instead set the Spark config under Compute -> Advanced Options -> Spark Config in Databricks, I do receive metrics, and the settings show up under Environment Variables in the Spark UI.
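Roughly, the job-level configuration looks like this (host, port and prefix below are placeholders, not my actual values):

```scala
import org.apache.spark.sql.SparkSession

// Job-level metrics configuration set on the SparkSession in code.
// Host, port and prefix are placeholders for illustration.
val spark = SparkSession.builder()
  .appName("metrics-to-statsd")
  .config("spark.metrics.conf.*.sink.statsd.class",
    "org.apache.spark.metrics.sink.StatsdSink")
  .config("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1") // placeholder
  .config("spark.metrics.conf.*.sink.statsd.port", "8125")      // placeholder
  .config("spark.metrics.conf.*.sink.statsd.prefix", "myjob")   // placeholder
  .getOrCreate()
```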

[Screenshot: SparkSession configuration code]
[Screenshot: AWS Databricks -> Compute -> Advanced Options -> Spark Config (cluster)]
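The cluster-level configuration that does work is entered as one `key value` pair per line in the Spark Config box under Compute -> Advanced Options, roughly like this (same placeholder values):

```
spark.metrics.conf.*.sink.statsd.class org.apache.spark.metrics.sink.StatsdSink
spark.metrics.conf.*.sink.statsd.host 127.0.0.1
spark.metrics.conf.*.sink.statsd.port 8125
spark.metrics.conf.*.sink.statsd.prefix myjob
```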

I also tried updating "metrics.properties" with an init script on the Databricks cluster, but the values do not appear under Environment Variables in the Spark UI. I then overrode the Spark session configuration again after the session was created and printed the configuration values to the console; they are still not reflected in the Environment Variables.

[Screenshot: Spark config override after Spark session creation]
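The override after session creation looked roughly like this (placeholder values again):

```scala
// Override attempted after the SparkSession already exists
// (placeholder host/port, not my actual values).
spark.conf.set("spark.metrics.conf.*.sink.statsd.class",
  "org.apache.spark.metrics.sink.StatsdSink")
spark.conf.set("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1")
spark.conf.set("spark.metrics.conf.*.sink.statsd.port", "8125")

// Print the effective metrics settings from the session config to the console.
spark.conf.getAll
  .filter { case (k, _) => k.startsWith("spark.metrics") }
  .foreach { case (k, v) => println(s"$k=$v") }
```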

Upvotes: 1

Views: 257

Answers (0)
