Almond Joy

Reputation: 119

Apache Spark with pip install not working

I'm trying to install Apache Spark with Python and used the pip install pyspark command from the July 11 release. The install succeeds, and in a python shell I can run

from pyspark import SparkContext

but I can't launch the pyspark shell by running

pyspark

or submit a job with

spark-submit

Both fail with a 'cannot find the path specified' error. I'm on Windows and suspect I'm missing the JAR files for Spark. Shouldn't the pip install have taken care of this?

Upvotes: 2

Views: 2482

Answers (1)

timchap

Reputation: 513

The directory containing pyspark and spark-submit (it should be <spark install directory>\bin) is missing from your PATH.
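If you're not sure where pip put things, a quick check from Python can locate that directory. This is a minimal sketch assuming a pip-installed pyspark, where the launcher scripts ship inside the package's bin folder:

import os
import pyspark

# A pip install places the package under site-packages\pyspark;
# the launcher scripts (pyspark.cmd, spark-submit.cmd on Windows)
# are expected to sit in its bin subdirectory.
spark_home = os.path.dirname(pyspark.__file__)
print(os.path.join(spark_home, "bin"))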

You can run them by fully specifying their location, by navigating to that directory and running pyspark there, or by adding the directory to your system PATH.
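For example, the first option might look like this from Python; the script name spark-submit.cmd is an assumption for a Windows pip install, so adjust it if your layout differs:

import os
import subprocess
import pyspark

# Invoke spark-submit by its full path instead of relying on PATH.
bin_dir = os.path.join(os.path.dirname(pyspark.__file__), "bin")
subprocess.run([os.path.join(bin_dir, "spark-submit.cmd"), "--version"])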

Upvotes: 1
