Reputation: 323
I tried installing Apache Spark on my 64-bit Windows 7 machine.
I used the guides -
This is what I did -
1. Install Scala. Set the environment variable SCALA_HOME and add %SCALA_HOME%\bin to Path. Result: the scala command works on the command prompt.
2. Unpack the pre-built Spark. Set the environment variable SPARK_HOME and add %SPARK_HOME%\bin to Path.
3. Download winutils.exe and place it under C:/hadoop/bin. Set the environment variable HADOOP_HOME and add %HADOOP_HOME%\bin to Path. (A rough sketch of the commands I used is below.)
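For reference, this is roughly what I ran from cmd to set these variables; the install locations (C:\scala, C:\spark-2.1.1-bin-hadoop2.7, C:\hadoop) are just where I put things on my machine and may differ on yours.

REM setx only affects NEW cmd windows, so open a fresh prompt afterwards
setx SCALA_HOME "C:\scala"
setx SPARK_HOME "C:\spark-2.1.1-bin-hadoop2.7"
setx HADOOP_HOME "C:\hadoop"
REM Extend the user Path with the three bin folders (note: setx truncates values longer than 1024 characters)
setx Path "%Path%;C:\scala\bin;C:\spark-2.1.1-bin-hadoop2.7\bin;C:\hadoop\bin"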
I already have JDK 8 installed.
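To double-check the JDK from cmd (the jdk1.8.0_131 path is just my install location, yours may differ):

java -version
echo %JAVA_HOME%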
Now, the problem is, when I run spark-shell from C:/spark-2.1.1-bin-hadoop2.7/bin, I get this -
"C:\Program Files\Java\jdk1.8.0_131\bin\java" -cp "C:\spark-2.1.1-bin-hadoop2.7\conf\;C:\spark-2.1.1-bin-hadoop2.7\jars\*" "-Dscala.usejavacp=true" -Xmx1g org spark.repl.Main --name "Spark shell" spark-shell
Is it an error? Am I doing something wrong?
Thanks!
Upvotes: 2
Views: 1480
Reputation: 1590
I had the same issue when trying to install Spark locally on Windows 7. Please make sure the paths below are correct, and I am sure it will work for you.
Now you can open cmd and run spark-shell, and it will work.
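For example, something like this from a fresh cmd window should confirm everything is wired up; the echoed values come from my setup and yours may differ.

echo %SCALA_HOME%
echo %SPARK_HOME%
echo %HADOOP_HOME%
where winutils.exe
where spark-shell
spark-shell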
Upvotes: 0