turbo

Reputation: 589

Multiple Apache Spark versions with Cassandra

I am using spark-cassandra-connector. I have installed Spark 1.3.1 and Cassandra 2.0.14. For some reason I want to try Spark 1.2.2. How should I configure my Eclipse Java project to use Spark 1.2.2 instead of 1.3.1? These are the only configuration settings in the app:

     import org.apache.spark.SparkConf;
     import org.apache.spark.api.java.JavaSparkContext;

     String cassandraHost = "127.0.0.1";
     SparkConf conf = new SparkConf(true); // true = also load settings from system properties
     conf.set("spark.cassandra.connection.host", cassandraHost);
     conf.set("spark.cleaner.ttl", "3600"); // clean metadata older than 3600 seconds
     conf.setMaster("local");
     conf.setAppName(appName);

     JavaSparkContext context = new JavaSparkContext(conf);

Upvotes: 0

Views: 59

Answers (1)

yjshen

Reputation: 6693

I think you should create a Maven project and declare the Spark dependency's version as a property, like this:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>
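
You would do the same for the connector itself. As a rough sketch (I'm assuming the DataStax coordinates of that era and a connector.version property you'd define yourself; connector 1.2.x goes with Spark 1.2 and 1.3.x with Spark 1.3, but check the connector's compatibility table):

<dependency>
  <groupId>com.datastax.spark</groupId>
  <!-- assumed coordinates: the Java API artifact DataStax published for the 1.x line -->
  <artifactId>spark-cassandra-connector-java_${scala.binary.version}</artifactId>
  <version>${connector.version}</version>
</dependency>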

Then change the spark.version property accordingly before you build the jar.
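
For the ${...} placeholders to resolve, the pom also needs a properties section. A minimal sketch, assuming the Scala 2.10 builds that Spark 1.2/1.3 shipped with (connector.version is the hypothetical property from the sketch above):

<properties>
  <!-- Scala 2.10 was the default binary version for Spark 1.2.x/1.3.x -->
  <scala.binary.version>2.10</scala.binary.version>
  <spark.version>1.3.1</spark.version>
  <!-- assumption: pick the connector release line that matches spark.version -->
  <connector.version>1.3.0</connector.version>
</properties>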

Optionally, you could create two profiles to override the spark.version property above:

<profile>
  <id>spark-1.2</id>
  <properties>
    <spark.version>1.2.2</spark.version>
  </properties>
</profile>
<profile>
  <id>spark-1.3</id>
  <properties>
    <spark.version>1.3.1</spark.version>
  </properties>
</profile>

and activate the appropriate profile when you build.
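
For example, using Maven's standard -P flag with the profile ids defined above:

mvn clean package -Pspark-1.2

and -Pspark-1.3 for the Spark 1.3.1 build.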

Upvotes: 1
