Priyanka

Reputation: 271

How to specify which java version to use in spark-submit command?

I want to run a Spark Streaming application on a YARN cluster on a remote server. The default Java version there is 1.7, but I want to use 1.8 for my application; it is also installed on the server, just not as the default. Is there a way to specify the location of Java 1.8 through spark-submit so that I do not get the "Unsupported major.minor version" error?

Upvotes: 18

Views: 42954

Answers (5)

Avinash Ganta

Reputation: 163

The Java version needs to be set for both the Spark App Master and the Spark executors that will be launched on YARN. Thus the spark-submit command must include two JAVA_HOME settings: spark.executorEnv.JAVA_HOME and spark.yarn.appMasterEnv.JAVA_HOME

spark-submit --class com.example.DataFrameExample \
  --conf "spark.executorEnv.JAVA_HOME=/jdk/jdk1.8.0_162" \
  --conf "spark.yarn.appMasterEnv.JAVA_HOME=/jdk/jdk1.8.0_162" \
  --master yarn --deploy-mode client \
  /spark/programs/DataFrameExample/target/scala-2.12/dfexample_2.12-1.0.jar

Upvotes: 2

Masterbuilder

Reputation: 509

If you want to set the Java environment for Spark on YARN, you can set it as a --conf option when calling spark-submit:

--conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/java/jdk1.8.0_121 \
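For context, a full invocation with that option might look like the sketch below. The application class, JDK path, and jar path are placeholders; substitute your own, and note that this conf only covers the App Master, so executors may still need spark.executorEnv.JAVA_HOME as described in the other answers:

```shell
# Hypothetical example: class name, JDK path, and jar are assumptions
spark-submit --class com.example.MyStreamingApp \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/java/jdk1.8.0_121 \
  --master yarn --deploy-mode cluster \
  my-streaming-app.jar
```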

Upvotes: 3

Carlos Gomez

Reputation: 230

Add the JAVA_HOME that you want in spark-env.sh (locate the file with sudo find / -name spark-env.sh, e.g.: /etc/spark2/conf.cloudera.spark2_on_yarn/spark-env.sh)
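As a sketch, the relevant lines in spark-env.sh would look something like this (the JDK path below is an assumption; point it at wherever Java 8 is installed on your nodes):

```
# spark-env.sh -- sourced by Spark's launch scripts on each node
# Example path; replace with your actual Java 8 install location
export JAVA_HOME=/usr/java/jdk1.8.0_162
export PATH="$JAVA_HOME/bin:$PATH"
```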

Upvotes: 1

mathieu

Reputation: 2438

JAVA_HOME alone was not enough in our case: the driver was running on Java 8, but I discovered later that the Spark workers on YARN were launched using Java 7 (the Hadoop nodes have both Java versions installed).

I had to add spark.executorEnv.JAVA_HOME=/usr/java/<version available in workers> to spark-defaults.conf. Note that you can also provide it on the command line with --conf.
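A minimal spark-defaults.conf fragment covering both the executors and the App Master might look like this. The JDK path is an example, not taken from the answer; it must match a path that actually exists on the worker nodes:

```
# spark-defaults.conf -- whitespace separates key and value
# Example JDK path; use the Java 8 location present on your workers
spark.executorEnv.JAVA_HOME        /usr/java/jdk1.8.0_162
spark.yarn.appMasterEnv.JAVA_HOME  /usr/java/jdk1.8.0_162
```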

See http://spark.apache.org/docs/latest/configuration.html#runtime-environment

Upvotes: 16

Radu

Reputation: 2060

Although you can force the driver code to run on a particular Java version (export JAVA_HOME=/path/to/jre/ && spark-submit ...), the workers will execute the code with the default Java version from the yarn user's PATH on the worker machine.

What you can do is set each Spark instance to use a particular JAVA_HOME by editing the spark-env.sh files (documentation).

Upvotes: 4
