Jessica Chambers

Reputation: 1316

Spark-submit looking in wrong directory

I have just installed Anaconda, Apache Spark, PySpark, and Scala on a fresh Linux Mint install (all latest versions).

To test the install I have tried running spark-submit in a terminal but I get the following error:

File "/home/jessica/anaconda/bin/find_spark_home.py", line 74, in <module>
    print(_find_spark_home())
  File "/home/jessica/anaconda/bin/find_spark_home.py", line 56, in _find_spark_home
    module_home = os.path.dirname(find_spec("pyspark").origin)
AttributeError: 'NoneType' object has no attribute 'origin'
/home/jessica/anaconda/bin/spark-submit: line 27: /bin/spark-class: No such file or directory

I see that the command is looking in /bin/ instead of the correct /usr/local/spark/bin.
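For reference, here is a quick check of what the wrapper script is working with (assuming the python that resolves first on my PATH is the anaconda one the traceback points at):

    # Which python the shell resolves first, and whether it can locate pyspark
    which python
    python -c 'from importlib.util import find_spec; print(find_spec("pyspark"))'
    # The AttributeError above comes from find_spec returning None at this point
    # Confirm what spark-submit would see for SPARK_HOME
    echo "$SPARK_HOME"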

My $PATH variable contains the following: /usr/local/spark/bin:/home/jessica/anaconda/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin::/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games

I also have an env variable called $SPARK_HOME that contains /usr/local/spark/.

How can I tell my system to look in the right directory instead?

Upvotes: 0

Views: 1411

Answers (1)

Jessica Chambers

Reputation: 1316

To fix this error, I had to manually set the JAVA_HOME variable in /etc/environment.
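For example, the entry can look like the line below (the JDK path shown is only an illustration; point it at wherever your own Java installation lives):

    # /etc/environment -- example JDK path, adjust to your own install
    JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

After editing /etc/environment, log out and back in (or reboot) so the new value is picked up, then verify it with echo $JAVA_HOME.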

Upvotes: 1
