sat

Reputation: 25

Cassandra Spark Connector - NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef

I am trying to connect Spark with a Cassandra database, but I get the error below. I think there must be a version mismatch somewhere.

code:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import com.datastax.spark.connector.cql.CassandraConnector;
    import com.datastax.driver.core.PreparedStatement;
    import com.datastax.driver.core.Session;

    SparkConf conf = new SparkConf().setAppName("kafka-sandbox").setMaster("local[2]");
    conf.set("spark.cassandra.connection.host", "192.168.34.1"); // connection for Cassandra database
    JavaSparkContext sc = new JavaSparkContext(conf);
    CassandraConnector connector = CassandraConnector.apply(sc.getConf());
    final Session session = connector.openSession(); // error in this line
    final PreparedStatement prepared = session.prepare("INSERT INTO spark_test5.messages JSON ?");

error:


    Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef;
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:82)

pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>SparkPoc</groupId>
  <artifactId>Spark-Poc</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>2.0.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
    </dependency> 
  </dependencies>
<build>
    <plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
            <source>1.8</source>
            <target>1.8</target>
        </configuration>
    </plugin>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.4.1</version>
        <configuration>
            <!-- get all project dependencies -->
            <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <!-- MainClass in mainfest make a executable jar -->
            <archive>
                    <manifest>
                            <mainClass>com.nwf.Consumer</mainClass>
                    </manifest>
            </archive>
        </configuration>
        <executions>
            <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                            <goal>single</goal>
                    </goals>
            </execution>
    </executions>
    </plugin>
    </plugins>
</build>
</project>

Spark version: 2.0.0

Scala version: 2.11.8

Upvotes: 1

Views: 831

Answers (3)

sat

Reputation: 25

In my pom.xml I changed the Scala version from 2.10 to 2.11.
Given below is the updated pom.xml:


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>SparkPoc</groupId>
  <artifactId>Spark-Poc</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.0.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
    </dependency> 
  </dependencies>
<build>
    <plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
            <source>1.8</source>
            <target>1.8</target>
        </configuration>
    </plugin>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.4.1</version>
        <configuration>
            <!-- get all project dependencies -->
            <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <!-- MainClass in mainfest make a executable jar -->
            <archive>
                    <manifest>
                            <mainClass>com.nwf.Consumer</mainClass>
                    </manifest>
            </archive>
        </configuration>
        <executions>
            <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                            <goal>single</goal>
                    </goals>
            </execution>
    </executions>
    </plugin>
    </plugins>
</build>
</project>
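To keep a mixed suffix from creeping back in later, one option is to fail the build whenever a `_2.10` artifact appears on the classpath. A sketch using the standard maven-enforcer-plugin with its `bannedDependencies` rule (the wildcard pattern is an assumption; check it against your enforcer version):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>1.4.1</version>
    <executions>
        <execution>
            <id>ban-scala-2.10</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <bannedDependencies>
                        <excludes>
                            <!-- fail the build if any Scala 2.10 artifact is resolved -->
                            <exclude>*:*_2.10</exclude>
                        </excludes>
                    </bannedDependencies>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>
```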

Upvotes: 0

Alex Karpov

Reputation: 564

According to your pom.xml, you are mixing Scala versions across dependencies:

  • spark-streaming_2.10
  • spark-core_2.10
  • spark-streaming-kafka-0-8_2.10
  • spark-cassandra-connector_2.11
  • spark-sql_2.11

All dependencies should use the same Scala version. Try changing everything to _2.11.
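The mismatch can also be spotted mechanically. A minimal sketch (the `ScalaSuffixCheck` class is hypothetical; in a real build, `mvn dependency:tree` shows the resolved artifactIds) that extracts the distinct Scala binary suffixes from a list of artifactIds:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import java.util.TreeSet;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ScalaSuffixCheck {
    // Collects the distinct Scala binary suffixes (_2.10, _2.11, ...) found
    // at the end of each artifactId. More than one suffix means the build
    // mixes Scala binary versions.
    static Set<String> scalaSuffixes(List<String> artifactIds) {
        Pattern p = Pattern.compile("_2\\.\\d+$");
        Set<String> suffixes = new TreeSet<>();
        for (String id : artifactIds) {
            Matcher m = p.matcher(id);
            if (m.find()) {
                suffixes.add(m.group());
            }
        }
        return suffixes;
    }

    public static void main(String[] args) {
        // The artifactIds from the question's pom.xml:
        List<String> ids = Arrays.asList(
            "spark-streaming_2.10", "spark-core_2.10",
            "spark-streaming-kafka-0-8_2.10",
            "spark-cassandra-connector_2.11", "spark-sql_2.11");
        System.out.println(scalaSuffixes(ids)); // prints [_2.10, _2.11]
    }
}
```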

Upvotes: 0

Sam Elamin
Sam Elamin

Reputation: 245

zero() on scala.runtime.VolatileObjectRef was introduced in Scala 2.11. You probably have a library compiled against Scala 2.11 running on a Scala 2.10 runtime.

See

v2.10: https://github.com/scala/scala/blob/2.10.x/src/library/scala/runtime/VolatileObjectRef.java

v2.11: https://github.com/scala/scala/blob/2.11.x/src/library/scala/runtime/VolatileObjectRef.java
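You can confirm which Scala runtime actually ends up on your classpath by probing for the method reflectively. A minimal sketch (the `hasMethod` helper is hypothetical, not part of the connector; it mirrors the link-time lookup the JVM performs before throwing `NoSuchMethodError`):

```java
public class ScalaRuntimeCheck {
    // Returns true if `className` is loadable and declares a public
    // no-arg method `methodName`; false if either lookup fails.
    static boolean hasMethod(String className, String methodName) {
        try {
            Class<?> c = Class.forName(className);
            c.getMethod(methodName);
            return true;
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Prints true on a Scala 2.11+ classpath, false on 2.10 (or no Scala).
        System.out.println(hasMethod("scala.runtime.VolatileObjectRef", "zero"));
    }
}
```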

Upvotes: 1
