Reputation: 319
I am using the blog post below to configure access to Cassandra from Apache Spark:
"http://www.datastax.com/dev/blog/accessing-cassandra-from-spark-in-java" "https://gist.github.com/jacek-lewandowski/278bfc936ca990bee35a#file-javademo-java-L177"
However, I am not able to import the CassandraJavaUtil class, and Eclipse displays the error "The import cannot be resolved" on this line:
import static com.datastax.spark.connector.CassandraJavaUtil.*;
Please help me resolve this error.
Many thanks.
Upvotes: 7
Views: 1614
Reputation: 550
The class CassandraJavaUtil has now been moved to the japi package in com.datastax.spark.connector. So, try using:
import static com.datastax.spark.connector.japi.CassandraJavaUtil.*;
Note: as per this doc:
https://github.com/datastax/spark-cassandra-connector/blob/master/doc/7_java_api.md
Since version 1.1.x, Java API comes with several useful factory methods which can be used to create factories of row readers of the two major kinds: type converter based and column mapper based.
Also note that the syntax for CassandraJavaUtil.javaFunctions() has changed as well. Go through the reference above carefully.
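For illustration, a minimal sketch of the newer japi style, assuming connector 1.1.x is on the classpath; the keyspace/table names, connection host, and the `Person` bean are all hypothetical, and the program needs a running Spark master and Cassandra node to actually execute:

```java
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;

import java.io.Serializable;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ReadExample {

    // Hypothetical bean mapped from table columns by name
    public static class Person implements Serializable {
        private Integer id;
        private String name;
        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("read-example")
                .set("spark.cassandra.connection.host", "127.0.0.1"); // assumed host
        JavaSparkContext sc = new JavaSparkContext(conf);

        // japi style: cassandraTable() now takes a RowReaderFactory argument,
        // here a column-mapper-based one produced by mapRowTo()
        JavaRDD<Person> people = javaFunctions(sc)
                .cassandraTable("test", "people", mapRowTo(Person.class));

        System.out.println(people.count());
        sc.stop();
    }
}
```

The key difference from the older examples is that the row-to-object mapping is passed as a factory argument to cassandraTable() rather than applied afterwards.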
Upvotes: 4
Reputation: 57748
I also followed the example in the first document that you linked. You'll notice that in the "Prerequisites" section, step #2 requires you to create the example as a Maven project. Step #3 lists four dependencies that you need to add to your project, two of which are specific to the Spark Connector.
Basically, the "dependencies" section of the pom.xml for my Spark projects looks like this:
<dependencies>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.1.0-alpha2</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.1.0-alpha2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.1.0</version>
    </dependency>
</dependencies>
Double-check that your pom.xml has those dependencies, and then invoke Maven to download the Spark Connector libraries locally. This worked for me:
cd workspace/sparkTest2
mvn package
Upvotes: 8