pac

Reputation: 501

Running multiple Spark versions on one server

I wish to deploy two versions of Spark on one server. Initially I had version 2.2.0 deployed.

Now I also have 2.0.1 deployed, but when I run start-master.sh in its sbin folder, version 2.2.0 is started. This is presumably because SPARK_HOME is still set to 2.2.0.

Is there any way I can configure SPARK_HOME so both versions will work?

I'm not sure if this makes any difference, but I don't plan on having both versions run at the same time.
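For concreteness, this is the kind of thing I'm hoping for; the install paths below are just placeholders:

    # Hypothetical layout -- both versions unpacked side by side:
    #   /opt/spark-2.0.1
    #   /opt/spark-2.2.0

    # Point SPARK_HOME at whichever version should run, then use
    # that version's scripts:
    export SPARK_HOME=/opt/spark-2.0.1
    "$SPARK_HOME/sbin/start-master.sh"    # starts the 2.0.1 master

    # Later, stop it and switch to the other version:
    "$SPARK_HOME/sbin/stop-master.sh"
    export SPARK_HOME=/opt/spark-2.2.0
    "$SPARK_HOME/sbin/start-master.sh"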

Upvotes: 0

Views: 866

Answers (1)

Radhakrishnan Rk

Reputation: 561

To run multiple versions of Spark in a single Hadoop cluster, you can easily manage each Spark service and its configuration with CDH parcels:

https://www.cloudera.com/documentation/spark2/latest/topics/spark2_installing.html

You should also configure the Spark shuffle service when you are going to run multiple Spark versions in a single cluster (a sketch follows the links below):

https://www.cloudera.com/documentation/spark2/latest/topics/spark2_requirements.html

http://spark.apache.org/docs/latest/job-scheduling.html#configuration-and-setup
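The last link describes the setup on YARN. As a rough sketch (property names are taken from those docs, but the exact steps vary by distribution, and the CDH parcel can handle much of this for you), each version's spark-defaults.conf enables the external shuffle service:

    # spark-defaults.conf for each Spark version
    spark.shuffle.service.enabled  true

and yarn-site.xml on every NodeManager registers the service as a YARN auxiliary service (the shuffle jar shipped with Spark must also be on the NodeManager's classpath):

    <property>
      <name>yarn.nodemanager.aux-services</name>
      <value>mapreduce_shuffle,spark_shuffle</value>
    </property>
    <property>
      <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
      <value>org.apache.spark.network.yarn.YarnShuffleService</value>
    </property>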

Upvotes: 1
