kng

Reputation: 301

How to switch to an older pyspark version?

I have pyspark 2.4.4 installed on my Mac.

➜  ~ pyspark --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.4
      /_/

Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_222
Branch
Compiled by user  on 2019-08-27T21:21:38Z
Revision
Url
Type --help for more information.

I need to revert to an older version, 2.3.2.

➜  ~ pip install pyspark==2.3.2
Collecting pyspark==2.3.2
Requirement already satisfied: py4j==0.10.7 in /workspace/anaconda3/lib/python3.6/site-packages (from pyspark==2.3.2) (0.10.7)
Installing collected packages: pyspark
  Found existing installation: pyspark 2.4.4
    Can't uninstall 'pyspark'. No files were found to uninstall.
Successfully installed pyspark-2.3.2

The command above appears to install pyspark 2.3.2, but it doesn't overwrite the existing pyspark 2.4.4. And when I check the path /usr/local/Cellar/apache-spark/, I only see a 2.4.4 sub-directory. I don't want to delete that directory, since it contains all the configuration at libexec/conf/spark-defaults.conf and the jars at libexec/jars.
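To see which installation Python actually imports (and whether one copy is shadowing another), a quick check like this can help; a generic sketch, not specific to my setup:

```python
import importlib.util

def installed_location(pkg):
    """Return the path a package would be imported from, or None if absent."""
    spec = importlib.util.find_spec(pkg)
    return getattr(spec, "origin", None) if spec else None

# Shows which copy of pyspark (if any) this interpreter will pick up.
print(installed_location("pyspark"))
```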

Is there a neat and robust way to switch apache-spark versions on a Mac?

Thanks in advance.

Upvotes: 6

Views: 11122

Answers (1)

Nrg

Reputation: 21

Try the --force-reinstall flag. Your command should look like:

pip install --force-reinstall pyspark==2.3.2
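After the reinstall, the package metadata can confirm the active version. A small sketch (it uses `pip` as a stand-in package name, since `pyspark` may not be present where this runs; substitute `pyspark` in practice):

```python
from importlib.metadata import version, PackageNotFoundError

def dist_version(pkg):
    """Return the installed version of a distribution, or None if missing."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# Substitute "pyspark" here; "pip" is only an always-present example.
print(dist_version("pip"))
```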

Upvotes: 2
