Reputation: 1103
I am trying to set an environment variable for my Spark application, which runs in local mode.
Here is the spark-submit command:
spark-submit --conf spark.executorEnv.FOO=bar --class com.amazon.Main SWALiveOrderModelSpark-1.0-super.jar
However, when I try to access it:
System.out.println("env variable:- " + System.getenv("FOO"));
the output is:
env variable:- null
Does anyone know how I can resolve this?
Upvotes: 4
Views: 6874
Reputation: 330153
spark.executorEnv.[EnvironmentVariableName] is used to (emphasis mine):
Add the environment variable specified by EnvironmentVariableName to the *Executor* process.
It won't be visible on the driver, except through org.apache.spark.SparkConf. To access it using System.getenv you have to do it in the right context, for example from a task:
sc.range(0, 1).map(_ => System.getenv("FOO")).collect.foreach(println)
bar
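For completeness, here is a self-contained sketch of the same point (a minimal example assuming the Scala API, with EnvDemo as a hypothetical app name), setting the variable programmatically instead of on the spark-submit command line and contrasting the two contexts:

import org.apache.spark.{SparkConf, SparkContext}

object EnvDemo {
  def main(args: Array[String]): Unit = {
    // Equivalent to --conf spark.executorEnv.FOO=bar on spark-submit
    val conf = new SparkConf()
      .setAppName("EnvDemo")
      .setMaster("local[*]")
      .set("spark.executorEnv.FOO", "bar")
    val sc = new SparkContext(conf)

    // Driver context: FOO is not in this process's environment,
    // so this prints null (the behaviour observed in the question)
    println("driver: " + System.getenv("FOO"))

    // Task context: the variable travels with the executor
    // environment, so this should print bar, as shown above
    sc.range(0, 1)
      .map(_ => System.getenv("FOO"))
      .collect()
      .foreach(v => println("executor: " + v))

    sc.stop()
  }
}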
Upvotes: 5
Reputation: 10092
You are setting a Spark environment variable using SparkConf. You'll have to use SparkConf to fetch it as well:
sc.getConf.get("spark.executorEnv.FOO")
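A quick sketch of that approach; note that the key keeps its full spark.executorEnv. prefix, and the getOption variant (an addition here, not from the answer) avoids an exception when the key is unset:

// Driver-side read of the configuration entry; returns "bar"
val foo = sc.getConf.get("spark.executorEnv.FOO")
println("from SparkConf: " + foo)

// getOption returns None instead of throwing NoSuchElementException
println(sc.getConf.getOption("spark.executorEnv.FOO").getOrElse("not set"))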
Upvotes: 1