user909481

Reputation: 150

Upgrade Spark in DSE (DataStax Enterprise) 4.6

Is it possible in DSE 4.6 to upgrade Spark and the corresponding Spark Cassandra Connector from version 1.1.0 to the most recent 1.2.0?

DSE currently ships with Spark and Spark Cassandra Connector 1.1.0 only.

Upvotes: 0

Views: 644

Answers (1)

phact

Reputation: 7305

Short Answer:

You will have to wait for a future DSE release for Spark 1.2 to be included and supported.

Alternatively:

You can use DSE's Cassandra with your own standalone Spark 1.2 setup, à la Al Tobey.
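
A minimal sketch of that route, assuming a standalone Spark 1.2 unpacked under /opt/spark-1.2.0, a connector 1.2 assembly jar you built yourself (e.g. with sbt assembly), and a DSE Cassandra node listening on 10.0.0.1 (all paths, jar names, and hosts below are placeholders):

# Hypothetical: run your own Spark 1.2 shell against DSE's Cassandra.
# Paths, jar name, and host are assumptions; adjust them to your setup.
/opt/spark-1.2.0/bin/spark-shell \
  --jars /opt/connector/spark-cassandra-connector-assembly_2.10-1.2.0.jar \
  --conf spark.cassandra.connection.host=10.0.0.1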

At your own risk:

I did successfully upgrade the connector to a newer version in a previous version of DSE (I think it was from 0.9 to 1.0; you would have to adjust the versions and dependencies for 1.2). These are the steps I followed (do this at your own risk, and do not attempt it in a production setting):

Use the following bash script to fetch the connector and its dependency jars (run it as sudo so that the mkdir works):

# Create a working directory for the connector jars.
mkdir -p /opt/connector
cd /opt/connector

# Clear out any jars left over from a previous run.
rm -f *.jar

# Fetch standalone Ivy (used below to resolve dependencies) and the connector itself.
curl -o ivy-2.3.0.jar   'https://repo1.maven.org/maven2/org/apache/ivy/ivy/2.3.0/ivy-2.3.0.jar'
curl -o spark-cassandra-connector_2.10-1.1.0-beta2.jar 'https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.10/1.1.0-beta2/spark-cassandra-connector_2.10-1.1.0-beta2.jar'

# Helper: resolve one Maven dependency with standalone Ivy and retrieve
# its jars into the current directory.
ivy () { java -jar ivy-2.3.0.jar -dependency "$@" -retrieve "[artifact]-[revision](-[classifier]).[ext]"; }

# The connector's runtime dependencies.
ivy org.apache.cassandra cassandra-thrift 2.0.11
ivy com.datastax.cassandra cassandra-driver-core 2.0.6
ivy joda-time joda-time 2.3
ivy org.joda joda-convert 1.6

# Drop sources/javadoc jars; only the runtime jars are needed.
rm -f *-{sources,javadoc}.jar

Next, find your Spark lib directory (in my case it's /usr/local/lib/dse/resources/spark/lib/) and copy all the jars you just generated into it. Then rename or remove the old connector jar (keep it around as a backup).
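
For example (the stock connector's exact filename will differ per install; these paths are assumptions):

cp /opt/connector/*.jar /usr/local/lib/dse/resources/spark/lib/
cd /usr/local/lib/dse/resources/spark/lib/
# Keep the stock connector as a backup instead of deleting it outright;
# the filename here is an assumption, check what your install actually ships.
mv spark-cassandra-connector_2.10-1.1.0.jar spark-cassandra-connector_2.10-1.1.0.jar.bak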

Restart DSE and start spark-shell:

dse spark

To verify which connector is loaded use:

dse spark -verbose
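
For instance, the connector jar should show up among the classpath elements that the verbose output prints; a quick filter (assuming that output format) looks like:

dse spark -verbose 2>&1 | grep spark-cassandra-connector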

Naturally, you'll have to do this for every node.
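
If you have several nodes, a small loop saves typing (hostnames below are placeholders, and DSE still needs a restart on each node):

# Push the new jars to each node; node1..node3 are placeholder hostnames.
for host in node1 node2 node3; do
  scp /opt/connector/*.jar "$host":/usr/local/lib/dse/resources/spark/lib/
done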

Note: I have not actually attempted upgrading the Spark version itself. It might just be a matter of swapping the jar, but I have not tried it. If you feel like hacking, give it a try and let us know!

Upvotes: 2
