Reputation: 51
I am trying to run the spark-shell command at the cmd prompt on Windows 7. I have installed Hadoop and kept it under C:\winutils\hadoop-common-2.2.0-bin-master\bin, and Spark under C:\Spark\spark-2.2.1-bin-hadoop2.7\bin.
While executing spark-shell, I am getting the following error:
C:\Spark\spark-2.2.1-bin-hadoop2.7\bin>spark-shell
The system cannot find the path specified.
Below are my environment variables:
HADOOP_HOME C:\winutils
JAVA_HOME C:\Program Files\IBM\Java80\jre
PATH C:\Users\IBM_ADMIN\AppData\Local\Programs\Python\Python36-32;C:\IBM\InformationServer\Clients\Classic;C:\Program Files\IBM\Java80\jre;C:\Windows\system32
SCALA_HOME C:\Program Files (x86)\scala\
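For reference, each value can be printed from cmd to double-check what is actually set (illustrative commands, not part of the error output):

echo %HADOOP_HOME%
echo %JAVA_HOME%
echo %SCALA_HOME%
echo %PATH%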
Upvotes: 5
Views: 16657
Reputation: 11
After setting my environment variables, I had the same problem, 'The system cannot find the path specified'. I restarted my computer, and now the spark-shell command works correctly.
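For anyone who wants to avoid a full reboot: values set with setx (or through the Environment Variables dialog) are only visible to newly started processes, so opening a fresh cmd window is often enough. A minimal sketch, assuming Spark lives under C:\Spark\spark-2.2.1-bin-hadoop2.7:

setx SPARK_HOME "C:\Spark\spark-2.2.1-bin-hadoop2.7"
rem close this window, open a new cmd prompt, then verify:
echo %SPARK_HOME%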
Upvotes: 0
Reputation: 1
In my case, all the paths were set up properly, like below:
JAVA_HOME = C:\Java
SPARK_HOME = C:\spark-3.3.2-bin-hadoop3
HADOOP_HOME = C:\hadoop-3.3.1
and the Path variable was updated with the below:
path = %JAVA_HOME%\bin; %SPARK_HOME%\bin; %HADOOP_HOME%\bin
However, I was still facing the error message 'The system cannot find the path specified'.
I did the following, which resolved my issue:
You may or may not need to downgrade the versions of Hadoop and Spark. First try restarting the laptop and putting quotes around the values of the environment variables (avoid having spaces in your environment variables).
Note: I have Java version 17 installed.
Additional information: the error 'The system cannot find the path specified' usually occurs when Windows is not able to locate the application in question in its directories. Windows depends on environment variables to look up this information.
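A quick way to see what Windows actually resolves, before downgrading anything, is to query the variables and the executable lookup from cmd (plain built-in commands; the output depends on your setup):

echo %JAVA_HOME%
echo %SPARK_HOME%
echo %HADOOP_HOME%
where java
where spark-shell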
Upvotes: 0
Reputation: 39
I was facing the same issue. The most important thing I did was make a change to an environment variable:
earlier I was using JAVA_HOME=C:\java\jdk1.8.0_311\bin, and that's why I was facing issues,
but it should be
JAVA_HOME=C:\java\jdk1.8.0_311
Spark and Hadoop are set as usual: SPARK_HOME=C:\spark\spark-3.0.3-bin-hadoop2.7 and HADOOP_HOME=C:\hadoop.
Under System variables, select Path and add the semicolon-separated entries C:\java\jdk1.8.0_311\bin;C:\hadoop\bin;C:\spark\spark-3.0.3-bin-hadoop2.7\bin
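As a sketch, the corrected variable can also be set from cmd rather than the GUI (the JDK path is this answer's example; adjust it to your install):

setx JAVA_HOME "C:\java\jdk1.8.0_311"
rem note: no trailing \bin -- the \bin part belongs in Path as %JAVA_HOME%\bin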
Upvotes: 3
Reputation: 17
I had the same issue when using Apache Spark on Windows 10 Pro.
NB:
Uninstall any Java JDK above 8 (I used jdk1.8.0_181); JDK versions 11-16 caused the problem.
Verify the downloaded Spark archive using 'certutil -hashfile c:\users\username\Downloads\spark-2.4.5-bin-hadoop2.7.tgz SHA512'. Remember to replace 'username' with yours, for instance 'certutil -hashfile c:\users\datamind\Downloads\spark-2.4.5-bin-hadoop2.7.tgz SHA512'.
Search for 'Edit Environment Variables'.
Set JAVA_HOME to C:\Program Files\Java\jdk1.8.0_181.
In 'User variables', edit Path and append ;%JAVA_HOME%\bin.
Repeat the previous two steps for HADOOP_HOME and SPARK_HOME.
Kindly follow this link and do everything step by step: https://phoenixnap.com/kb/install-spark-on-windows-10
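Once everything is in place, a quick sanity check from a new cmd window could look like this (assuming the variables set above):

java -version
echo %JAVA_HOME%
spark-submit --version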
Upvotes: 2
Reputation: 1590
I had the same issue when trying to install Spark locally on Windows 7. Please make sure the below paths are correct, and I am sure it will work for you.
JAVA_HOME = C:\Program Files\Java\jdk1.8.0_181 (and append ;%JAVA_HOME%\bin to Path)
SPARK_HOME = C:\spark-2.3.0-bin-hadoop2.7 (and append ;%SPARK_HOME%\bin to Path)
HADOOP_HOME = C:\Hadoop (and append ;%HADOOP_HOME%\bin to Path)
Now you can open cmd, run spark-shell, and it will work.
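For completeness, the same settings can be scripted with setx instead of the GUI (a sketch using this answer's example paths; setx values only appear in newly opened cmd windows):

setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_181"
setx SPARK_HOME "C:\spark-2.3.0-bin-hadoop2.7"
setx HADOOP_HOME "C:\Hadoop"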
Upvotes: 6
Reputation: 221
Your JAVA_HOME is set to the JRE; please make sure you point it to your JDK folder (it should be located next to the JRE).
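A quick way to tell the two apart: the JDK folder contains javac.exe under bin, while the JRE does not. Based on the paths in the question (treat them as an example for your own install):

echo %JAVA_HOME%
dir "%JAVA_HOME%\bin\javac.exe"
rem if javac.exe is not found, JAVA_HOME points at a JRE; use the JDK root instead, e.g.:
setx JAVA_HOME "C:\Program Files\IBM\Java80"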
Upvotes: 5