L.S.

Reputation: 31

Apache Spark with Hadoop distribution failing to run on Windows

I tried running the spark-1.5.1-bin-hadoop2.6 distribution (and newer versions of Spark, with the same results) on Windows using Cygwin.
When trying to execute the spark-shell script in the bin folder, I get the following output:

    Error: Could not find or load main class org.apache.spark.launcher.Main

I tried setting CLASSPATH to the location of lib/spark-assembly-1.5.1-hadoop2.6.0.jar, but to no avail.

(FYI: I am able to run the same distribution fine on my Mac with no extra setup steps required.)

Please assist in finding a resolution for Cygwin execution on Windows.

Upvotes: 3

Views: 2385

Answers (2)

Yosef Weiner

Reputation: 5751

My solution to the problem was to move the Spark installation to a path without spaces in it. Under Program Files I got the above error, but moving it directly under C:\ and running the spark-shell.bat file cleared it up.
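The space in "Program Files" is the likely culprit: in a shell script, an unquoted variable containing a space is split into multiple words. The sketch below is a generic shell demonstration of that splitting behavior, not Spark's actual launcher code, and the install path in it is hypothetical:

    # Demonstration only: an unquoted variable containing a space
    # word-splits into two arguments, which is how a path like
    # "C:\Program Files\..." can break a launcher script.
    SPARK_HOME="/c/Program Files/spark"   # hypothetical install path

    set -- $SPARK_HOME    # unquoted: splits on the space
    echo "$#"             # prints 2 (two words, not one path)

    set -- "$SPARK_HOME"  # quoted: the path stays intact
    echo "$#"             # prints 1

This is why relocating Spark under C:\ (no spaces anywhere in the path) sidesteps the problem entirely.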

Upvotes: 0

jlb

Reputation: 689

I ran into and solved a similar problem with Cygwin on Windows 10 and spark-1.6.0.

  1. Build with Maven (maybe you're past this step):

    mvn -DskipTests package

  2. Make sure JAVA_HOME is set to a JDK:

    $ export JAVA_HOME="C:\Program Files\Java\jdk1.8.0_60"

    $ ls "$JAVA_HOME"
    bin include LICENSE THIRDPARTYLICENSEREADME.txt ....

  3. Use the Windows batch file. Launch from PowerShell or Command Prompt if you have terminal problems with Cygwin:

    $ chmod a+x bin/spark-shell.cmd

    $ ./bin/spark-shell.cmd
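Step 2 above can be sanity-checked before launching: a full JDK ships `bin/javac`, while a plain JRE does not, so a quick test tells you whether JAVA_HOME points at the right kind of install. This is a generic check, not something Spark itself performs:

    # Sanity check: a JDK ships bin/javac, a plain JRE does not.
    if [ -x "$JAVA_HOME/bin/javac" ]; then
      echo "JAVA_HOME looks like a JDK: $JAVA_HOME"
    else
      echo "JAVA_HOME is unset or not a JDK"
    fi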

Upvotes: 4
