Sanchay

Reputation: 1103

Unable to set Environment Variables in Spark Application

I am trying to set an environment variable for my Spark application, which runs in local mode.

Here is the spark-submit command:

spark-submit --conf spark.executorEnv.FOO=bar --class com.amazon.Main SWALiveOrderModelSpark-1.0-super.jar

However, when I try to access it:

System.out.println("env variable:- " + System.getenv("FOO"));

the output is:

env variable:- null

Does anyone know how I can resolve this?

Upvotes: 4

Views: 6874

Answers (2)

zero323

Reputation: 330153

spark.executorEnv.[EnvironmentVariableName] is used to (emphasis mine):

Add the environment variable specified by EnvironmentVariableName to the Executor process.

It won't be visible on the driver, except through org.apache.spark.SparkConf. To access it using System.getenv you have to do it in the right context, for example from a task:

sc.range(0, 1).map(_ => System.getenv("FOO")).collect.foreach(println)
// prints: bar
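
As a self-contained sketch of that driver/executor split (not from the original answer; the object name EnvCheck is made up, and note that in local mode tasks run inside the driver JVM, so the executor line only prints bar when executors are separate processes, e.g. on a cluster):

import org.apache.spark.{SparkConf, SparkContext}

object EnvCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("EnvCheck"))

    // Driver process: FOO is not in this JVM's environment, so this prints null
    println("driver: " + System.getenv("FOO"))

    // Task code runs on executors; when they are separate processes launched
    // with spark.executorEnv.FOO=bar applied, this prints bar
    sc.range(0, 1)
      .map(_ => System.getenv("FOO"))
      .collect()
      .foreach(v => println("executor: " + v))

    sc.stop()
  }
}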

Upvotes: 5

philantrovert

Reputation: 10092

You are setting a Spark configuration property (spark.executorEnv.FOO), not a plain environment variable, using SparkConf. You'll have to use SparkConf to fetch it as well:

sc.getConf.get("spark.executorEnv.FOO")
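
For example, a minimal usage sketch (assuming sc is the SparkContext of the job submitted above):

// Driver side: read the property back from the application's SparkConf
// (assumes the job was submitted with --conf spark.executorEnv.FOO=bar)
val foo = sc.getConf.get("spark.executorEnv.FOO")
println("env variable:- " + foo)   // prints: env variable:- bar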

Upvotes: 1
