Raptor0009

Reputation: 268

How to set 2 different Spark versions (2.4.0 and 2.4.4) in .bashrc locally

I need Spark 2.4.0 for HBase and Spark 2.4.4 for Kafka streaming, because the HBase jar dependency is not available for version 2.4.4.

So I thought I could use an if/else in .bashrc, with a variable to explicitly select 2.4.4 and 2.4.0 as the default.

export SPARK_HOME_240_VERSION=/home/user/Softwares/spark-2.4.0-bin-hadoop2.7/
export SPARK_HOME_244_VERSION=/home/user/Softwares/spark-2.4.4-bin-hadoop-2.7-scala-2.12/
export SPARK_CURRENT_VERSION=240

if [ "$SPARK_CURRENT_VERSION" -eq 244 ]
then
   export SPARK_HOME=$SPARK_HOME_244_VERSION
else
   export SPARK_HOME=$SPARK_HOME_240_VERSION
fi

But in the shell:

user@user-VirtualBox:~$ echo $SPARK_HOME
/home/user/Softwares/spark-2.4.0-bin-hadoop2.7/
user@user-VirtualBox:~$ echo $SPARK_CURRENT_VERSION
240
user@user-VirtualBox:~$ export SPARK_CURENT_VERSION=244
user@user-VirtualBox:~$ echo $SPARK_CURRENT_VERSION
240
user@user-VirtualBox:~$ echo $SPARK_HOME
/home/user/Softwares/spark-2.4.0-bin-hadoop2.7/
user@user-VirtualBox:~$ 

Both Spark installs must be configured, with one variable defaulting to 240 so that 2.4.0 is used by default. Changing that variable's value to 244 should switch to 2.4.4.

Upvotes: 0

Views: 505

Answers (1)

Nic3500

Reputation: 8601

You could set up aliases to do this; I find it faster than doing exports by hand:

alias 240='export SPARK_HOME=/home/user/Softwares/spark-2.4.0-bin-hadoop2.7/'
alias 244='export SPARK_HOME=/home/user/Softwares/spark-2.4.4-bin-hadoop-2.7-scala-2.12/'

To switch, simply type 240 or 244.
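If you would rather have a single entry point than two aliases, the same idea can be wrapped in a shell function. This is just a sketch: `use_spark` is a made-up name, and the paths are the ones from your question.

```shell
# Hypothetical helper: switch SPARK_HOME by short version tag.
use_spark() {
    case "$1" in
        240) export SPARK_HOME=/home/user/Softwares/spark-2.4.0-bin-hadoop2.7/ ;;
        244) export SPARK_HOME=/home/user/Softwares/spark-2.4.4-bin-hadoop-2.7-scala-2.12/ ;;
        *)   echo "usage: use_spark 240|244" >&2; return 1 ;;
    esac
}
```

Put it in .bashrc and run `use_spark 244` (or 240) whenever you need to switch; unlike the if/else approach, no re-sourcing is required.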

Another alias could be set to quickly see which version is "on":

alias sparkhome='echo $SPARK_HOME'

Note that there is a typo in your sample; you do:

export SPARK_CURENT_VERSION=244
echo $SPARK_CURRENT_VERSION

Do you see it? You are missing an 'R' in the export line, so the variable your .bashrc checks is never actually set to 244. Also keep in mind that the if/else in .bashrc only runs when the file is sourced, so even with the correct name you would have to run `source ~/.bashrc` after changing the variable.
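There is a second pitfall worth flagging: your .bashrc unconditionally runs `export SPARK_CURRENT_VERSION=240`, so re-sourcing it would overwrite any value you exported beforehand. A default-if-unset expansion avoids that. A sketch, using the paths from your question:

```shell
# In .bashrc: keep any value already exported in the environment,
# falling back to 240 only when the variable is unset or empty.
export SPARK_CURRENT_VERSION="${SPARK_CURRENT_VERSION:-240}"

if [ "$SPARK_CURRENT_VERSION" = "244" ]
then
   export SPARK_HOME=/home/user/Softwares/spark-2.4.4-bin-hadoop-2.7-scala-2.12/
else
   export SPARK_HOME=/home/user/Softwares/spark-2.4.0-bin-hadoop2.7/
fi
```

With that in place, `export SPARK_CURRENT_VERSION=244; source ~/.bashrc` would actually switch you to 2.4.4.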

Upvotes: 2
