Andy

Reputation: 441

Not able to run pyspark from cmd from spark directory

I have installed Spark 1.6 (pre-built for Hadoop 2.6) on my Windows 10 system, and I have set up the environment variables properly. When I try to run pyspark, I get this error message.

However, I can run the "python" command from the spark directory, and it returns the correct version.

Can anyone help me to solve this?

Upvotes: 1

Views: 620

Answers (1)

Sahil Desai

Reputation: 3696

When you run python, it enters the Python command line directly, but for pyspark you have to execute the pyspark executable file, which is not present in your current location. You are running it from C:\spark, but the pyspark file is located at C:\spark\bin\pyspark, so you need to change to that directory and run pyspark from there.
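Assuming Spark is installed at C:\spark as in the question, the fix can be sketched at the Windows command prompt like this (paths are from the question; your install location may differ):

```shell
:: Change into the bin directory, where the pyspark launcher lives
cd C:\spark\bin

:: Now the executable is on the current path and can be started
pyspark
```

Alternatively, adding C:\spark\bin to your PATH environment variable would let you run pyspark from any directory.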

Upvotes: 2
