Cassie

Reputation: 3099

Change Hadoop version for Spark

How can I set the Hadoop version for a Spark application without submitting a jar and pointing at a specific Hadoop binary? Is that even possible? I am not sure how the Hadoop version can be changed when submitting a Spark application.

Something like this does not work:

  val sparkSession = SparkSession
    .builder
    .master("local[*]")
    .appName("SparkJobHDFSApp")
    .getOrCreate()
  sparkSession.sparkContext.hadoopConfiguration.set("hadoop.common.configuration.version", "2.7.4")

Upvotes: 2

Views: 1290

Answers (1)

OneCricketeer

Reputation: 191701

It can't be. The Spark master and workers each have their own Hadoop JARs on the classpath, and your application must be compatible with them.
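In other words, the Hadoop version is fixed by the Spark installation itself, not by a runtime configuration property. If you need Spark to run against a particular Hadoop version, the documented route is to use one of Spark's "Hadoop free" builds and point it at your own Hadoop installation via `SPARK_DIST_CLASSPATH`. A minimal sketch, assuming a local Hadoop 2.7.4 install (the path below is a placeholder):

```shell
# conf/spark-env.sh
# Assumes a "Hadoop free" Spark build; Spark will pick up Hadoop classes
# from whatever installation `hadoop classpath` resolves to.
export SPARK_DIST_CLASSPATH=$(/opt/hadoop-2.7.4/bin/hadoop classpath)
```

This changes which Hadoop JARs Spark loads on every node, which is why it must be done at the cluster/installation level rather than inside the application code.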

Upvotes: 3
