mahendra maid

Reputation: 467

ERROR: User did not initialize spark context

Error log:

TestSuccessfull
2018-08-20 04:52:15 INFO ApplicationMaster:54 - Final app status: FAILED, exitCode: 13
2018-08-20 04:52:15 ERROR ApplicationMaster:91 - Uncaught exception:
java.lang.IllegalStateException: User did not initialize spark context!
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:345)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply$mcV$sp(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$5.run(ApplicationMaster.scala:800)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:799)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:259)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:824)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
2018-08-20 04:52:15 INFO SparkContext:54 - Invoking stop() from shutdown hook

Console error log after the submit command:

2018-08-20 05:47:35 INFO Client:54 - Application report for application_1534690018301_0035 (state: ACCEPTED)
2018-08-20 05:47:36 INFO Client:54 - Application report for application_1534690018301_0035 (state: ACCEPTED)
2018-08-20 05:47:37 INFO Client:54 - Application report for application_1534690018301_0035 (state: FAILED)
2018-08-20 05:47:37 INFO Client:54 - client token: N/A
    diagnostics: Application application_1534690018301_0035 failed 2 times due to AM Container for appattempt_1534690018301_0035_000002 exited with exitCode: 13
    Failing this attempt.Diagnostics: [2018-08-20 05:47:36.454]Exception from container-launch.
    Container id: container_1534690018301_0035_02_000001
    Exit code: 13

My code:

val sparkConf = new SparkConf().setAppName("Gathering Data")            
val sc = new SparkContext(sparkConf)

Submit command:

spark-submit --class spark_basic.Test_Local --master yarn --deploy-mode cluster /home/IdeaProjects/target/Spark-1.0-SNAPSHOT.jar

Description:

I have installed Spark on Hadoop in pseudo-distributed mode.

spark-shell works fine; the problem only occurs when I use cluster mode.

My code also works fine: I am able to print output, but at the end it gives this error.

Upvotes: 5

Views: 30403

Answers (3)

Harsha TJ

Reputation: 272

I presume your code has a line which sets the master to local:

sparkConf.setMaster("local[*]")

If so, comment out that line and try again, since you are already setting the master to yarn in your submit command:

/usr/cdh/current/spark-client/bin/spark-submit --class com.test.sparkApp --master yarn --deploy-mode cluster --num-executors 40 --executor-cores 4 --driver-memory 17g --executor-memory 22g --files /usr/cdh/current/spark-client/conf/hive-site.xml /home/user/sparkApp.jar
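For illustration, a minimal sketch of driver code with the master left unset, so that --master yarn from the command above takes effect (the package and object names are assumptions matching the --class flag, not code from the answer):

package com.test

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: no setMaster call, so spark-submit's --master yarn takes effect
object sparkApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Gathering Data")
    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}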

Upvotes: 12

Artyom Rebrov

Reputation: 691

This error may occur if you submit the Spark job like this: spark-submit --class some.path.com.Main --master yarn --deploy-mode cluster some_spark.jar (passing master and deploy-mode as arguments in the CLI) while at the same time having new SparkContext in your code.

Either get the context with val sc = SparkContext.getOrCreate(), or do not pass the master and deploy-mode arguments to spark-submit if you want to create a new SparkContext.
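A minimal sketch of the first option, reusing the asker's app name (getOrCreate also has an overload that accepts a SparkConf):

import org.apache.spark.{SparkConf, SparkContext}

// Returns the context already created by the YARN application master if one
// exists; otherwise creates a new one from this configuration
val sc = SparkContext.getOrCreate(new SparkConf().setAppName("Gathering Data"))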

Upvotes: -1

mahendra maid

Reputation: 467

Finally I got it working with:

spark-submit

/home/mahendra/Marvaland/SparkEcho/spark-2.3.0-bin-hadoop2.7/bin/spark-submit --master yarn --class spark_basic.Test_Local /home/mahendra/IdeaProjects/SparkTraining/target/SparkTraining-1.0-SNAPSHOT.jar

Spark session:

val spark = SparkSession.builder()
    .appName("DataETL")
    .master("local[1]")
    .enableHiveSupport()
    .getOrCreate()
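(Note: per the answers above, the hard-coded .master("local[1]") would usually be dropped when submitting with --master yarn; a minimal sketch of that variant:)

import org.apache.spark.sql.SparkSession

// Master intentionally omitted; spark-submit --master yarn supplies it
val spark = SparkSession.builder()
    .appName("DataETL")
    .enableHiveSupport()
    .getOrCreate()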

thanks @cricket_007

Upvotes: 2
