Pricey

Reputation: 81

Unable to run spark-shell from the terminal on macOS

I was trying to run spark-shell from the terminal on macOS. Running ./Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell works, but I want to be able to launch it with just spark-shell.

I watched a 4-minute video that shows how it's done, but it isn't working for me.

I don't fully understand how ~/.bash_profile works, but here is what it looks like:

# added by Anaconda3 5.3.1 installer
# >>> conda init >>>
# !! Contents within this block are managed by 'conda init' !!

__conda_setup="$(CONDA_REPORT_ERRORS=false '/Users/ajay/anaconda3/bin/conda' shell.bash hook 2> /dev/null)"
if [ $? -eq 0 ]; then
    \eval "$__conda_setup"
else
    if [ -f "/Users/ajay/anaconda3/etc/profile.d/conda.sh" ]; then
        . "/Users/ajay/anaconda3/etc/profile.d/conda.sh"
        CONDA_CHANGEPS1=false conda activate base
    else
        \export PATH="/Users/ajay/anaconda3/bin:$PATH"
    fi
fi
unset __conda_setup
# <<< conda init <<<
export SPARK_HOME=/Users/ajay/Documents/spark/spark-3.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
PATH="/usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"

The $PATH gives /usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
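
A quick sanity check (assuming bash) of where, if anywhere, the shell finds the command:

echo "$PATH"          # print the directories the shell searches, in order
type -a spark-shell   # list every location where spark-shell would resolve; errors if none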

How do I need to change ~/.bash_profile for spark-shell to work?

EDIT

Here is the message that I get on running ./Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell:

20/08/27 16:51:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.2:4040
Spark context available as 'sc' (master = local[*], app id = local-1598527288778).
Spark session available as 'spark'.

On running spark-shell it shows: -bash: spark-shell: command not found

Upvotes: 1

Views: 2286

Answers (1)

user156548

export PATH=$PATH:$SPARK_HOME/bin
PATH="/usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"

These two lines are in the wrong order: the first appends the Spark installation to $PATH, and the second then immediately overwrites $PATH with a fixed list that doesn't include it.
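
The effect is easy to demonstrate in isolation (a toy sketch; /opt/demo/bin is just a made-up directory):

PATH=$PATH:/opt/demo/bin              # appends to whatever PATH currently holds
PATH="/usr/local/bin:/usr/bin:/bin"   # overwrites PATH entirely; /opt/demo/bin is gone
echo "$PATH"                          # prints /usr/local/bin:/usr/bin:/bin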

You might prefer to have something like:

export PATH="$SPARK_HOME/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:$PATH"

Don't forget that any changes to .bash_profile, .profile, or .bashrc only take effect in a new shell (unless you load them manually).
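
For example, to apply the change in the current shell and verify it (the expected path assumes the installation directory from the question):

source ~/.bash_profile   # reload the profile in place, or just open a new terminal
which spark-shell        # should print /Users/ajay/Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell
spark-shell              # now launches from anywhere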

Upvotes: 1
