Marko Taht

Reputation: 1522

Spark on Windows 10 not working

I'm trying to get Spark working on Windows 10. When I try to run spark-shell I get this error:

'Spark\spark-2.0.0-bin-hadoop2.7\bin..\jars""\ is not recognized as an internal or external command,operable program or batch file.

Failed to find Spark jars directory. You need to build Spark before running this program.

I am using a pre-built Spark for Hadoop 2.7 or later. I have installed Java 8, Eclipse Neon, Python 2.7, and Scala 2.11, and obtained winutils for Hadoop 2.7.1, and I still get this error.

When I downloaded Spark it came as a .tgz; when extracted there was another archive inside, so I extracted that as well, and then I got all the bin folders and so on. I need to run spark-shell. Can anyone help?

EDIT: The solution I ended up using:

1) Virtual box

2) Linux mint

Upvotes: 1

Views: 4866

Answers (3)

Vid123

Reputation: 11

"On Windows, I found that if it is installed in a directory that has a space in the path (C:\Program Files\Spark) the installation will fail. Move it to the root or another directory with no spaces." OR If you have installed Spark under “C:\Program Files (x86)..” replace 'Program Files (x86)' with Progra~2 in the PATH env variable and SPARK_HOME user variable.

Upvotes: 0

Priti Singh

Reputation: 91

I got the same error while building Spark. You can move the extracted folder to C:\
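A rough sketch of that from the command prompt, assuming the archive was extracted into your Downloads folder (the source path and folder name are just examples):

    rem Move the extracted Spark folder to C:\ so the path has no spaces
    move "%USERPROFILE%\Downloads\spark-2.0.0-bin-hadoop2.7" "C:\spark-2.0.0-bin-hadoop2.7"
    rem Point SPARK_HOME at the new location
    setx SPARK_HOME "C:\spark-2.0.0-bin-hadoop2.7"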

Refer to this: http://techgobi.blogspot.in/2016/08/configure-spark-on-windows-some-error.html

Upvotes: 3

Ani Menon

Reputation: 28199

You are probably pointing at the wrong path for Spark's bin folder.

Just open a command prompt and change directory to the bin folder inside the Spark folder.

Type spark-shell to check.
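Something like this, assuming Spark sits at C:\spark-2.0.0-bin-hadoop2.7 (use the path you actually extracted it to):

    rem Change into Spark's bin folder and launch the shell
    cd C:\spark-2.0.0-bin-hadoop2.7\bin
    spark-shell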

Refer: Spark on win 10

Upvotes: 1
