jtitusj

Reputation: 3086

How to Build spark to run on existing Hadoop 2.6 Cluster from CDH

I have an existing Hadoop cluster running Hadoop 2.6.0-cdh5.4.2, with Spark 1.5.1 already running on it. However, I want to use Spark 2.0 / Spark 2.1 (with some modifications to my code).

Update

I have found out from the Cloudera forums that, in theory, I could just download Spark 2.0 (pre-built for Hadoop 2.6), change HADOOP_CONF_DIR in conf/spark-env.sh, and do something like

./bin/spark-shell --master yarn

and basically I would have Spark 2.0 running on my cluster. However, it still doesn't work. I'm running out of potential solutions, which is why I came here.
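For reference, the procedure from the forum post can be sketched as the shell session below. The release URL, version, and the `/etc/hadoop/conf` path are assumptions (the latter is the usual client-config location on a CDH node); adjust them for your cluster.

```shell
#!/bin/sh
# Sketch of running a stock Spark 2.0 build against an existing CDH YARN cluster.
# (Download commented out so this can be read as a checklist; uncomment to run.)

# 1. Fetch Spark pre-built for Hadoop 2.6 (version/URL are assumptions):
# wget https://archive.apache.org/dist/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.6.tgz
# tar -xzf spark-2.0.0-bin-hadoop2.6.tgz

# 2. Point Spark at the cluster's Hadoop/YARN configuration
#    (either export here or set in conf/spark-env.sh):
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"

# 3. Launch the shell on YARN from the unpacked directory:
# ./spark-2.0.0-bin-hadoop2.6/bin/spark-shell --master yarn
```

If the shell starts but executors fail, the YARN NodeManager logs and the `spark.yarn.jars` / `spark.yarn.archive` settings are the usual places to look, since the cluster-side Spark 1.5 install plays no role for a self-contained Spark 2.0 download.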

Upvotes: 0

Views: 262

Answers (1)

Justin Kestelyn

Reputation: 934

A Spark 2.0 Beta will be available from Cloudera soon. You'll be able to easily install it on Cloudera Manager-managed clusters via a parcel.

Upvotes: 0

Related Questions