AKC

Reputation: 1023

java.lang.NoClassDefFoundError: org/apache/kafka/common/message/KafkaLZ4BlockOutputStream

I am getting a NoClassDefFoundError while using the Spark Streaming API. Here is my streaming code.

I know this is a problem with some missing jars or dependencies, but I couldn't figure out exactly which one.

I am using Kafka 0.9.0 and Spark 1.6.1. Are these dependencies fine, or do I need to change them? I have attached my pom.xml below.

Here is the streaming API call I am using.

JavaPairInputDStream<String, byte[]> directKafkaStream = KafkaUtils.createDirectStream(
        jsc, String.class, byte[].class, StringDecoder.class, DefaultDecoder.class, kafkaParams, topicSet);
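For context, the call above also needs the `kafkaParams` map and `topicSet` that are passed in. A minimal sketch of those two inputs (the broker address and topic name are placeholders, not taken from the question):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class StreamInputsSketch {
    public static void main(String[] args) {
        // Kafka connection properties for the 0.8-style direct stream API.
        // "localhost:9092" is a placeholder broker list.
        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", "localhost:9092");

        // Topics to subscribe to; "mytopic" is a placeholder name.
        Set<String> topicSet = new HashSet<>(Collections.singletonList("mytopic"));

        System.out.println(kafkaParams.get("metadata.broker.list") + " " + topicSet);
    }
}
```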

Here is my code snippet. I am receiving the error at while (itr.hasNext()):

directKafkaStream.foreachRDD(rdd -> {

    rdd.foreachPartition(itr -> {

        try {
            while (itr.hasNext()) {   // <-- the NoClassDefFoundError is thrown here
                Tuple2<String, byte[]> record = itr.next();
                // process the record ...
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    });
});

java.lang.NoClassDefFoundError: org/apache/kafka/common/message/KafkaLZ4BlockOutputStream

Here is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>com.abcd.rep.xyz</groupId>
        <artifactId>xyz</artifactId>
        <version>1.0</version>
        <relativePath>../pom.xml</relativePath>
    </parent>
    <artifactId>SparkPOC</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>SparkPOCde</name>
    <url>http://maven.apache.org</url>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spark-version>1.6.1</spark-version>
        <kafka-version>0.9.0.0</kafka-version>
    </properties>


    <dependencies>

        <!-- http://mvnrepository.com/artifact/org.springframework/spring-core -->
        <!-- http://mvnrepository.com/artifact/org.springframework/spring-jdbc -->
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>

        <!-- http://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>${spark-version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.6.2</version>
            <exclusions>
                <exclusion>
                    <groupId>io.netty</groupId>
                    <artifactId>netty</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>io.jboss.netty</groupId>
                    <artifactId>netty</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.abcd.rep.xyz</groupId>
            <artifactId>xyzCommon</artifactId>
            <version>1.0</version>
            <type>jar</type>
        </dependency>

        <!-- http://mvnrepository.com/artifact/ojdbc/ojdbc -->
        <!-- <dependency> <groupId>ojdbc</groupId> <artifactId>ojdbc</artifactId> <version>14</version> </dependency> -->
        <!-- https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver -->
        <!-- http://mvnrepository.com/artifact/org.springframework.data/spring-data-mongodb -->
        <!-- https://mvnrepository.com/artifact/com.googlecode.json-simple/json-simple -->

    </dependencies>


    <build>
        <finalName>appname</finalName>

        <resources>
            <resource>
                <directory>src/main/resources</directory>
                <excludes>
                    <exclude>eventRules.json</exclude>
                    <exclude>log4j.xml</exclude>
                    <exclude>resources.properties</exclude>
                </excludes>
            </resource>
        </resources>

        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                    <resource>reference.conf</resource>
                                </transformer>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>com.abcd.rep.xyz.SparkPOCde.EventConsumerServiceImpl</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

</project>

Upvotes: 0

Views: 27787

Answers (3)

AKC

Reputation: 1023

I used the Kafka jar for version 0.8.2.2 to resolve this issue.
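A sketch of what that could look like in Maven. The answer only names the version, so the coordinates below are an assumption (the `kafka-clients` artifact from `org.apache.kafka`):

```xml
<!-- Hypothetical coordinates for the 0.8.2.2 client jar mentioned above -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2.2</version>
</dependency>
```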

Upvotes: 0

Matiji66

Reputation: 737

My Kafka cluster version is 0.9.0.0, and I use a Maven POM like this to process Kafka with Spark Streaming:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.0</version>
</dependency>

But I got the error described above. Then I added the following dependencies, and it works:

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.2.1</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2.1</version>
</dependency>

Upvotes: 1

gile

Reputation: 6006

KafkaLZ4BlockOutputStream is in the kafka-clients jar.

Up to kafka-clients version 0.8.2.2 it is in org/apache/kafka/common/message/KafkaLZ4BlockOutputStream.

From 0.9.0.0 onwards it is in org/apache/kafka/common/record/KafkaLZ4BlockOutputStream.
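One way to see which of the two locations your runtime classpath actually provides is a small probe like the following. This is a diagnostic sketch, not part of the original answer; the class names are taken from the error message and the package move described above.

```java
// Diagnostic sketch: probe both historical locations of KafkaLZ4BlockOutputStream
// to see which one (if either) the current classpath provides.
public class Lz4ClassProbe {

    // Returns true if the named class can be loaded from the classpath.
    static boolean present(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Location in kafka-clients <= 0.8.2.2 (per the answer above)
        System.out.println("message package: "
                + present("org.apache.kafka.common.message.KafkaLZ4BlockOutputStream"));
        // Location in kafka-clients >= 0.9.0.0
        System.out.println("record package: "
                + present("org.apache.kafka.common.record.KafkaLZ4BlockOutputStream"));
    }
}
```

Running this inside the Spark job (or with the assembled jar on the classpath) shows which kafka-clients version actually won the dependency resolution.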

Upvotes: 4
