Reputation: 1551
When I try:
from pyspark import SparkContext, SparkConf
sc = SparkContext()
I get:
KeyError: 'SPARK_HOME'
What is the solution?
Upvotes: 3
Views: 3136
Reputation: 21904
pyspark relies on the Spark SDK. You need to have that installed before using pyspark.
Once that's installed you need to set the environment variable SPARK_HOME to tell pyspark where to look for your Spark installation. If you're on a *nix system you can do so by adding the following to your .bashrc:
export SPARK_HOME=<location of spark install>
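If you'd rather not edit .bashrc, a minimal sketch of setting the variable from inside the script itself, assuming Spark is unpacked under /opt/spark (adjust the path to your own install):
import os

# Example path only; replace with your actual Spark install location.
os.environ.setdefault("SPARK_HOME", "/opt/spark")

from pyspark import SparkContext

sc = SparkContext()
print(sc.version)  # sanity check that the context came up
sc.stop()
The key point is that SPARK_HOME must be in the environment before the SparkContext is created, which is exactly what the KeyError is complaining about.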
If you're using Windows there's a somewhat convoluted way of setting variables through the GUI, described here. From the command prompt you can use set in place of export:
SET SPARK_HOME=<location of spark install>
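Note that set only lasts for the current command prompt session. If you want the variable to persist across sessions, setx can be used instead (the path below is just an example):
SETX SPARK_HOME "C:\spark"
You'll need to open a new command prompt afterwards for the setx value to take effect.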
Upvotes: 3