user1663930

Reputation: 315

Permission denied when using pyspark

I get the following error when I try to run a pyspark program:

/usr/local/Cellar/apache-spark/1.6.0/bin/load-spark-env.sh: line 2:   
/usr/local/Cellar/apache-spark/1.6.0/libexec/bin/load-spark-env.sh: Permission denied
/usr/local/Cellar/apache-spark/1.6.0/bin/load-spark-env.sh: line 2: 
exec: /usr/local/Cellar/apache-spark/1.6.0/libexec/bin/load-spark-env.sh: cannot execute: Undefined error: 0

I have tried:

unset SPARK_HOME && spark-submit

but then I get a different error:

KeyError: 'SPARK_HOME'

Any idea how to fix this? I am running Python 2.7 on OS X 10.11.

Upvotes: 2

Views: 7479

Answers (2)

zevij

Reputation: 2446

Check the permissions on:

/usr/local/Cellar/apache-spark/2.0.2/libexec/bin/load-spark-env.sh

I had a similar issue: that file was marked read-only for my user ID, with execute permission granted only to root.

Note that when you invoke pyspark, it delegates to the shell script above. So you can start the process, but because the script lacks the 'x' permission for your user, it fails.
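To check and fix this, something like the following should work (the path is the one from your error message; adjust it to your Spark version, and the `chmod` may need `sudo` if root owns the file):

```shell
# Script that pyspark delegates to (path taken from the error in the question)
SCRIPT=/usr/local/Cellar/apache-spark/1.6.0/libexec/bin/load-spark-env.sh

# Inspect current permissions; look for 'x' in the user column (e.g. -rwxr--r--)
ls -l "$SCRIPT"

# Grant your user read and execute permission on the script
sudo chmod u+rx "$SCRIPT"
```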

Alternatively, sudo -H pyspark would also do the trick.

Happy Sparking!

Upvotes: 4

Erol

Reputation: 6526

export SPARK_HOME=/path/to/spark/installation
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH

Executing the above in your terminal sets SPARK_HOME as an environment variable and prepends Spark's Python libraries to your PYTHONPATH.
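You can then confirm the variable is visible from Python, which is where the KeyError: 'SPARK_HOME' was raised. A minimal check, using the libexec path from your error message as an example SPARK_HOME:

```shell
# Example path based on the error in the question; adjust to your installation
export SPARK_HOME=/usr/local/Cellar/apache-spark/1.6.0/libexec
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH"

# If this prints the path instead of raising KeyError, pyspark can see it too
python -c 'import os; print(os.environ["SPARK_HOME"])'
```

Note that exports only last for the current shell session; add them to your ~/.bash_profile to make them permanent.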

Upvotes: 2
