Reputation: 1071
I'm trying to install Apache Spark on Windows 10. I downloaded Spark and winutils.exe, set SPARK_HOME and HADOOP_HOME, and updated the PATH variable to include the Spark bin directory. Still, when I run spark-shell I get the error below. What's the problem?
C:\tools\spark-2.1.1-bin-hadoop2.7\bin>spark-shell
'""C:\Program' is not recognized as an internal or external command,
operable program or batch file.
Upvotes: 1
Views: 2629
Reputation: 966
You can also set JAVA_HOME to the 8.3 short name "C:\Progra~1\Microsoft\jdk-11.0.22.7-hotspot" to get rid of the spaces.
REM No surrounding quotes: SET stores them as part of the value, which breaks spark-shell
SET JAVA_HOME=C:\Progra~1\Microsoft\jdk-11.0.22.7-hotspot
DIR %JAVA_HOME%
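Note that Progra~1 is not guaranteed to resolve the same way on every system, so it's worth looking up the actual 8.3 short name first. A minimal sketch at the cmd prompt (the JDK path is the one from this answer; adjust it to your install):

REM List the 8.3 short names of the entries in C:\
dir /x C:\

REM Or print the short form of a specific path (interactive syntax; use %%I inside a .bat file)
for %I in ("C:\Program Files\Microsoft\jdk-11.0.22.7-hotspot") do @echo %~sI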
Upvotes: 0
Reputation: 11
I had the same problem. Just move the Spark folder to C:\ and point the environment variable at that path. It should work.
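For example (a minimal sketch, assuming the extracted folder is the spark-2.1.1-bin-hadoop2.7 build from the question and currently sits in your Downloads folder; adjust both paths to yours):

REM Move the extracted Spark folder to the root of C: so the path contains no spaces
move "%USERPROFILE%\Downloads\spark-2.1.1-bin-hadoop2.7" C:\spark-2.1.1-bin-hadoop2.7

REM Point SPARK_HOME at it for future sessions (or use the System Properties dialog)
setx SPARK_HOME C:\spark-2.1.1-bin-hadoop2.7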
Upvotes: 0
Reputation: 36
What I figured out after trying for a long time and going through different articles is that this issue is related to setting up the environment variables correctly. Things are actually simple; you just need to get the setup right to see your spark shell working. The steps below get it right and working.
Install Java (1.7+) under "C:" or under a directory whose full path contains no spaces. For example, I installed Java 1.8 under "C:\Java". But if you have already installed Java under "Program Files"/"Program Files (x86)", you need to put both the JAVA_HOME and PATH variables in double quotes, like JAVA_HOME="C:\Program Files\Java" and PATH="%JAVA_HOME%\bin".
C:\Users\ankitthakur>java -version
java version "1.8.0_131"
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)
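Setting this up for the current cmd session looks something like the following (a sketch; the jdk1.8.0_131 folder name under C:\Java is an assumption, so adjust it to your install; note that set takes the value without surrounding quotes, since quotes get stored as part of the value):

REM Set JAVA_HOME for the current session; the exact folder name is assumed, adjust to yours
set JAVA_HOME=C:\Java\jdk1.8.0_131

REM Put the JDK's bin directory on the PATH
set PATH=%JAVA_HOME%\bin;%PATH%

REM Verify
java -version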
Install Scala under "C:\Scala" just for consistency, or choose any other directory. Set SCALA_HOME and add %SCALA_HOME%\bin to the PATH variable.
C:\Users\ankitthakur>scala -version
Scala code runner version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL
Install SBT under "C:\Sbt" and similarly set up the SBT_HOME and PATH.
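The same pattern as with Java works for both of these (a sketch for the current session, assuming the C:\Scala and C:\Sbt directories above):

REM Point SCALA_HOME and SBT_HOME at the install directories and add their bin folders to PATH
set SCALA_HOME=C:\Scala
set SBT_HOME=C:\Sbt
set PATH=%SCALA_HOME%\bin;%SBT_HOME%\bin;%PATH%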
Download Spark from the link below. Please remember to download a pre-built version for Hadoop; otherwise you need to build the downloaded source code yourself, which you can do through Maven if you already have it installed (otherwise download and install that too). Place it under the "C:\Spark" directory, again just for consistency, and set up SPARK_HOME and PATH. Download link: http://spark.apache.org/downloads.html
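Putting it together for the current session (a sketch, assuming the pre-built spark-2.1.1-bin-hadoop2.7 package from the question extracted under C:\Spark, and winutils.exe placed under C:\Hadoop\bin; both locations are assumptions, adjust them to yours):

REM Point SPARK_HOME and HADOOP_HOME at the extracted package and the winutils directory
set SPARK_HOME=C:\Spark\spark-2.1.1-bin-hadoop2.7
set HADOOP_HOME=C:\Hadoop
set PATH=%SPARK_HOME%\bin;%HADOOP_HOME%\bin;%PATH%

REM If everything is in place, this should start the Scala REPL with a SparkContext
spark-shell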
Finally, we are done! It seems lengthy, but it actually isn't, believe me; just make sure you have everything properly in place. Please have a look at the snapshot of the environment variables on my machine. (screenshot: environment variables set for setting up Spark)
Upvotes: 2