Mostafa

Reputation: 1551

KeyError: 'SPARK_HOME' in pyspark

when I try:

from pyspark import SparkContext, SparkConf
sc = SparkContext()

I get:

KeyError: 'SPARK_HOME'

What is the solution?

Upvotes: 3

Views: 3136

Answers (1)

Slater Victoroff

Reputation: 21904

pyspark is only a Python front end to Apache Spark itself. You need a Spark installation before pyspark will work.

Once Spark is installed, you need to set the environment variable SPARK_HOME so pyspark knows where to find the installation. On a *nix system you can do so by adding the following to your .bashrc:

export SPARK_HOME=<location of spark install>
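After adding that line, reload the file (for example with source ~/.bashrc) or open a new terminal so the variable is picked up.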

If you're using Windows, there's a somewhat convoluted way of setting environment variables through the System Properties GUI. From the command prompt you can use set in place of export:

SET SPARK_HOME=<location of spark install>
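Note that set only applies to the current command-prompt session; to persist the variable across sessions, use setx SPARK_HOME <location of spark install> or the GUI mentioned above.

If you'd rather not edit shell or system settings at all, you can also set the variable from Python before pyspark needs it. A minimal sketch, assuming the pyspark package itself is importable and Spark is unpacked at /opt/spark (a placeholder path; substitute your own install location):

import os

# Point SPARK_HOME at the Spark installation before pyspark launches the JVM.
# /opt/spark is an assumed location; adjust it to your install.
os.environ.setdefault('SPARK_HOME', '/opt/spark')

from pyspark import SparkContext, SparkConf

conf = SparkConf().setAppName('test').setMaster('local[*]')
sc = SparkContext(conf=conf)
print(sc.version)  # should print your Spark version if SPARK_HOME is correct
sc.stop()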

Upvotes: 3
