Pallavi

Reputation: 51

spark-shell The system cannot find the path specified

I am trying to run the spark-shell command at the cmd prompt on Windows 7. I have installed Hadoop and kept it under C:\winutils\hadoop-common-2.2.0-bin-master\bin, and Spark under C:\Spark\spark-2.2.1-bin-hadoop2.7\bin.

While executing spark-shell, I am getting the following error.

C:\Spark\spark-2.2.1-bin-hadoop2.7\bin>spark-shell
The system cannot find the path specified.

Below are my environment variables:

HADOOP_HOME C:\winutils

JAVA_HOME   C:\Program Files\IBM\Java80\jre

PATH        C:\Users\IBM_ADMIN\AppData\Local\Programs\Python\Python36-32;C:\IBM\InformationServer\Clients\Classic;C:\Program Files\IBM\Java80\jre;C:\Windows\system32

SCALA_HOME  C:\Program Files (x86)\scala\


Upvotes: 5

Views: 16657

Answers (6)

Antoine-prog

Reputation: 11

After setting my environment variables, I had the same problem, 'The system cannot find the path specified'. I restarted my computer, and now my spark-shell command works correctly.
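
This is expected behaviour on Windows: changes made in the Environment Variables dialog (or with setx) are only visible to processes started afterwards, so at minimum a new console (or a reboot) is needed. A minimal cmd illustration, using the question's Spark path as an example:

    :: setx persists the value for FUTURE sessions only
    setx SPARK_HOME "C:\Spark\spark-2.2.1-bin-hadoop2.7"

    :: In the SAME window this still prints the old (or undefined) value
    echo %SPARK_HOME%

    :: Open a NEW cmd window, then:
    echo %SPARK_HOME%
    :: now prints C:\Spark\spark-2.2.1-bin-hadoop2.7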

Upvotes: 0

skedia8

Reputation: 1

In my case, all the paths were set up properly, like below:

JAVA_HOME = C:\Java
SPARK_HOME = C:\spark-3.3.2-bin-hadoop3
HADOOP_HOME = C:\hadoop-3.3.1

and the Path variable was updated with the below:

Path = %JAVA_HOME%\bin;%SPARK_HOME%\bin;%HADOOP_HOME%\bin

However, I was still facing the error message 'The system cannot find the path specified'.

I did the following, which resolved my issue:

  1. Uninstalled the latest versions of Hadoop (hadoop-3.4.1) and Spark (3.4), and installed hadoop-3.3.1 and spark 3.3.1 instead (I suspected that hadoop 3.4.1 didn't go along with Spark 3.4).
  2. Restarted my laptop.
  3. Included double quotes in the HOME paths.

You may or may not need to downgrade Hadoop and Spark. First try restarting the laptop and putting quotes around the environment variables (and avoid spaces in your environment variable values).

Note: I have Java version 17 installed.

Additional information: the error "system cannot find the path specified" usually occurs when Windows is not able to locate the application in question; Windows depends on environment variables to look up this information.
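
A quick way to confirm that Windows can actually locate everything is to echo the variables and ask which executables the Path resolves; a minimal check from a fresh cmd window, using this answer's paths:

    :: Print the variables the launch scripts rely on
    echo %JAVA_HOME%
    echo %SPARK_HOME%
    echo %HADOOP_HOME%

    :: Ask Windows which executables the Path actually resolves
    where java
    where spark-shell
    where winutils

If any of the where lookups fails, the corresponding \bin entry in Path is missing or wrong.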

Upvotes: 0

dillip

Reputation: 39

I was facing the same issue. The most important thing I did was change an environment variable:

Earlier I was using JAVA_HOME=C:\java\jdk1.8.0_311\bin, which is why I was facing the issue.

But it should be:
JAVA_HOME=C:\java\jdk1.8.0_311

Spark and Hadoop are set as usual: SPARK_HOME=C:\spark\spark-3.0.3-bin-hadoop2.7 and HADOOP_HOME=C:\hadoop.

Under system variables, select Path and add: C:\java\jdk1.8.0_311\bin;C:\hadoop\bin;C:\spark\spark-3.0.3-bin-hadoop2.7\bin
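
The underlying rule is that JAVA_HOME must point at the JDK root, not at its bin folder, because Spark's Windows launch scripts append \bin\java to it themselves. A sketch of the same setup with setx (the install paths are this answer's; note that setx only affects new consoles and truncates values longer than 1024 characters):

    :: JAVA_HOME is the JDK root; the launch scripts add \bin\java to it
    setx JAVA_HOME "C:\java\jdk1.8.0_311"
    setx SPARK_HOME "C:\spark\spark-3.0.3-bin-hadoop2.7"
    setx HADOOP_HOME "C:\hadoop"

    :: Path entries are separated by semicolons, not commas
    setx PATH "%PATH%;C:\java\jdk1.8.0_311\bin;C:\hadoop\bin;C:\spark\spark-3.0.3-bin-hadoop2.7\bin"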

Upvotes: 3

Amos Bunde

Reputation: 17

I had the same issue when using Apache Spark on Windows 10 Pro.

NB:

  1. Uninstall any Java JDK above 8 (jdk1.8.0_181); JDK 11-16 caused the problem.

  2. Verify the Apache download with 'certutil -hashfile C:\Users\username\Downloads\spark-2.4.5-bin-hadoop2.7.tgz SHA512'. Remember to replace 'username' with your own, for instance 'certutil -hashfile C:\Users\datamind\Downloads\spark-2.4.5-bin-hadoop2.7.tgz SHA512'.

  3. Search for 'Edit Environment Variables'.

  4. Create JAVA_HOME: C:\Program Files\Java\jdk1.8.0_181

  5. Under 'User variables', edit Path and append ;%JAVA_HOME%\bin

  6. Repeat steps 4 and 5 for HADOOP_HOME and SPARK_HOME.

Kindly follow this link and do everything step by step: https://phoenixnap.com/kb/install-spark-on-windows-10
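
After trimming back to JDK 8, it is worth confirming which Java the console actually picks up; output along these lines indicates a JDK 8 build (the exact update and build numbers will vary):

    java -version
    :: Expected shape of the output for JDK 8:
    :: java version "1.8.0_181"
    :: Java(TM) SE Runtime Environment (build 1.8.0_181-b13)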

Upvotes: 2

Moustafa Mahmoud

Reputation: 1590

I had the same issue when trying to install Spark locally on Windows 7. Please make sure the paths below are correct, and I am sure it will work for you.

  1. Create a JAVA_HOME variable: C:\Program Files\Java\jdk1.8.0_181
  2. Add the following part to your path: ;%JAVA_HOME%\bin
  3. Create a SPARK_HOME variable: C:\spark-2.3.0-bin-hadoop2.7
  4. Add the following part to your path: ;%SPARK_HOME%\bin
  5. The most important part: the Hadoop path should include the bin folder containing winutils.exe, as in C:\Hadoop\bin. Be sure winutils.exe is located inside this path.
  6. Create a HADOOP_HOME variable: C:\Hadoop
  7. Add the following part to your path: ;%HADOOP_HOME%\bin

Now you can open cmd, type spark-shell, and it will work.
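
Before launching, a quick sanity check that the layout matches what the launch scripts expect, assuming the exact paths from the steps above:

    :: Each HOME variable must name a folder with the right bin\ contents
    dir "%JAVA_HOME%\bin\java.exe"
    dir "%SPARK_HOME%\bin\spark-shell.cmd"
    dir "%HADOOP_HOME%\bin\winutils.exe"

If any of these dir commands fails, spark-shell is likely to stop with the same 'The system cannot find the path specified' error.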

Upvotes: 6

Michal

Reputation: 221

Your JAVA_HOME is set to a JRE; please make sure you point it to your JDK folder (it should be located next to the JRE).
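
One way to tell the two apart: only a JDK ships the javac compiler, so a quick probe of the current setting is:

    :: A JDK contains javac.exe; a JRE does not
    if exist "%JAVA_HOME%\bin\javac.exe" (echo JDK - ok) else (echo JRE - fix JAVA_HOME)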

Upvotes: 5
