fo_x86

Reputation: 2623

Access SparkConf from the worker

Is there a way to get Spark configurations from the worker (i.e. inside the closure of a map function)? I tried using

SparkEnv.get().conf()

but it doesn't seem to contain all the custom Spark configs that I set prior to creating the SparkContext.

EDIT:

Through SparkEnv I'm able to get the default configurations set via spark-defaults.conf, but the confs I set explicitly through the setter method,

SparkConf conf = new SparkConf();
conf.set("my.configuration.key", "myConfigValue");
SparkContext sc = new SparkContext(conf);

are not present in the SparkConf object I get through SparkEnv.get().conf().
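
For reference, here's roughly the full flow (a minimal sketch in Scala; the setAppName call and the one-element RDD are only there to make it runnable, the master URL is assumed to come from spark-submit, and in local mode the difference may not reproduce since everything runs in the driver JVM):

import org.apache.spark.{SparkConf, SparkContext, SparkEnv}

val conf = new SparkConf()
  .setAppName("conf-on-worker")
  .set("my.configuration.key", "myConfigValue")
val sc = new SparkContext(conf)

// On the driver the custom key is visible as expected
println(sc.getConf.get("my.configuration.key"))

// Inside a task, per the observation above, SparkEnv.get.conf only
// reflects the defaults and does not show the custom key
sc.parallelize(Seq(1)).foreachPartition { _ =>
  println(SparkEnv.get.conf.getOption("my.configuration.key"))
}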

Upvotes: 2

Views: 1010

Answers (1)

zero323

Reputation: 330413

SparkEnv is a part of the developer API and is not intended for external use.

You can simply create a broadcast variable, though.

// Capture the driver-side configuration as a plain Map and broadcast it
val confBd = sc.broadcast(sc.getConf.getAll.toMap)

// On the workers, read values from the broadcast instead of SparkEnv
rdd.foreachPartition(_ => println(confBd.value.get("spark.driver.host")))
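
The broadcast is shipped to each executor once and cached there, so looking values up inside tasks is cheap. Since it's just a Map[String, String], you can also supply a fallback, e.g. confBd.value.getOrElse("my.configuration.key", "default").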

Upvotes: 1
