user3725190

Reputation: 343

java.lang.ClassCastException: org.apache.hadoop.conf.Configuration cannot be cast to org.apache.hadoop.yarn.conf.YarnConfiguration

I am running a Spark application on YARN in Cloudera. Spark version: 2.1.

I get the following error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/yarn/nm/filecache/13/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.10.2-1.cdh5.10.2.p0.5/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/04/14 22:20:57 INFO util.SignalUtils: Registered signal handler for TERM
18/04/14 22:20:57 INFO util.SignalUtils: Registered signal handler for HUP
18/04/14 22:20:57 INFO util.SignalUtils: Registered signal handler for INT
Exception in thread "main" java.lang.ClassCastException: org.apache.hadoop.conf.Configuration cannot be cast to org.apache.hadoop.yarn.conf.YarnConfiguration
    at org.apache.spark.deploy.yarn.ApplicationMaster.(ApplicationMaster.scala:60)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:764)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:763)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)

Upvotes: 0

Views: 4213

Answers (2)

Damien Picard

Reputation: 381

I encountered the same issue while trying to start a Spark job using the YARN REST API. The reason was that the environment variable SPARK_YARN_MODE was missing. After adding this env var, everything works fine:

export SPARK_YARN_MODE=true
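If you submit with spark-submit rather than the REST API, the same variable can also be propagated to the YARN ApplicationMaster through Spark's spark.yarn.appMasterEnv.* setting. A minimal sketch, assuming cluster mode; com.example.MyApp and my-app.jar are placeholders:

# SPARK_YARN_MODE makes Spark pick YarnSparkHadoopUtil, whose newConfiguration()
# returns a YarnConfiguration, so the cast in ApplicationMaster no longer fails.
export SPARK_YARN_MODE=true
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.SPARK_YARN_MODE=true \
  --class com.example.MyApp \
  my-app.jar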

Upvotes: 0

user3725190

Reputation: 343

I managed to solve it by verifying that the Spark version configured in the SPARK_HOME variable matches the Hadoop version installed in Cloudera. From https://spark.apache.org/downloads.html you can download the build that matches your Hadoop version. The Hadoop version in Cloudera can be found by:

$ hadoop version
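To compare that against the Hadoop version bundled inside your Spark distribution, one rough check (assuming a pre-built "Spark with Hadoop" 2.x package, where the bundled Hadoop jars live under $SPARK_HOME/jars) is:

$ ls $SPARK_HOME/jars/hadoop-common-*.jar

If the version in the jar name does not match the output of hadoop version, download a matching build from the link above and point SPARK_HOME at it.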

Upvotes: 1
