
Reputation: 3517

spark could not find spark-class-launcher-output file when given --driver-class-path

I'm trying to get spark to play nice with aws. Working in a windows environment.

The NativeS3 classes are never found regardless of the options I've tried. Currently, if I use:

    spark-shell --packages com.amazonaws:aws-java-sdk-s3:1.10.38,com.amazonaws:aws-java-sdk-core:1.10.38,org.apache.hadoop:hadoop-aws:2.7.1

as my command, then it will download the files and I can use S3. However, that feels hacky, and downloading them every time is not ideal.
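(A sketch of one way to avoid the re-download on every launch, assuming a standard Spark layout: the same coordinates can be pinned in conf\spark-defaults.conf via the spark.jars.packages property, and the resolved jars are cached in the local Ivy cache after the first fetch, so subsequent launches reuse them.)

    # conf\spark-defaults.conf -- same coordinates as the --packages command above
    spark.jars.packages  com.amazonaws:aws-java-sdk-s3:1.10.38,com.amazonaws:aws-java-sdk-core:1.10.38,org.apache.hadoop:hadoop-aws:2.7.1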

With the help of another I've been trying other options which lead to :

>spark-shell --driver-class-path=C:\Spark\hadoop\share\hadoop\common\lib\hadoop-aws-2.7.1.jar;C:\Spark\hadoop\share\hadoop\common\lib\aws-java-sdk-1.7.4.jar" --verbose

There was an error about copying a file that didn't exist; I changed the temp path to rule out a permissions issue, but this error remains:

   C:\java_1.7\jdk1.7.0_79\bin\java -cp "C:\Spark\hadoop\share\hadoop\common\lib\hadoop-aws-2.7.1.jar;C:\Spark\hadoop\share\hadoop\common\lib\aws-java-sdk-1.7.4.jar --verbose > c:\temp\spark-class-launcher-output-4879.txt;C:\Spark\bin\..\conf
Xms1g -Xmx1g "-XX:MaxPermSize=256m" org.apache.spark.deploy.SparkSubmit --conf "spark.driver.extraClassPath=C:\Spark\hadoop\share\hadoop\common\lib\hadoop-aws-2.7.1.jar;C:\Spark\hadoop\share\hadoop\common\lib\aws-java-sdk-1.7.4.jar --verbo
The system cannot find the file c:\temp\spark-class-launcher-output-4879.txt.
Could Not Find c:\temp\spark-class-launcher-output-4879.txt

It has been pinpointed to this particular line in the spark-class2.cmd file, but I do not know how to solve it:

https://github.com/apache/spark/blob/master/bin/spark-class2.cmd#L59
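(For what it's worth, the mangled java -cp line in the error output above suggests the quoting, rather than the jars, may be the problem: the --driver-class-path value in the command ends with a quote that was never opened, so when spark-class builds the java command, everything after the classpath, including --verbose and the > redirect to the launcher-output file, ends up inside a quoted string, and the file is never written. A sketch of the same command with the whole semicolon-separated classpath wrapped in one pair of quotes:)

    spark-shell --driver-class-path "C:\Spark\hadoop\share\hadoop\common\lib\hadoop-aws-2.7.1.jar;C:\Spark\hadoop\share\hadoop\common\lib\aws-java-sdk-1.7.4.jar" --verbose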

Can anyone shed any light on this? Thank you in advance.

Upvotes: 2

Views: 2731

Answers (2)

Hasham

Reputation: 11

I had this issue for quite some time too. What I eventually found was that my JAVA_HOME path was misconfigured. Once I set it to the correct path, the issue was resolved.

Also, try following the instructions in this link: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-tips-and-tricks-running-spark-windows.html

Upvotes: 1

charles gomes

Reputation: 2155

One thing that caught my eye was the drive letter in the error:

c:\temp\spark-class-launcher-output-4879.txt

It is lowercase. Checking the code for spark-class2.cmd shows that it reads the %temp% environment variable.

Can you run echo %temp% at the Windows command line to see what it is set to?

If it is set to a lowercase path, then simply run set temp=C:\temp.

Then run spark-shell with --driver-class-path again.

Thanks,

Charles.

Upvotes: 1
