Vinod Jayachandran

Reputation: 3898

java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage

From my application, I am unable to connect to spark master because of the below error

Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -2221986757032131007, local class serialVersionUID = -5447855329526097695

I understand that this is a version compatibility issue, but I am unable to resolve it because the versions look fine to me. Below is my version info:

Application (Tomcat): Java 7

Spark installation: 2.1.0

./spark-shell --version 

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_91 

Maven Dependency

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.1.0</version>
    </dependency>

Is it because my application runs on Java 7 while Spark 2.1 runs on Java 8?

Upvotes: 0

Views: 3164

Answers (2)

Zawsx

Reputation: 1

The Scala version of your Spark installation has to match the Scala suffix of the library: since your installation is built with Scala 2.11, you should be using the spark-core_2.11 artifact instead of spark-core_2.10 (see the sketch below).
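As an illustration only (a minimal sketch based on the versions shown in the question; keep the Spark version in sync with your cluster), the dependency would become:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>

After changing the artifact, rebuild and redeploy so the old spark-core_2.10 jar is no longer on Tomcat's classpath.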

Upvotes: 0

vivek mishra

Reputation: 1162

This is a class compatibility issue. For Spark 2.1.0, check the versions of the Scala and Hadoop libraries (under the jars folder of the Spark installation) and keep your application's dependencies at the same versions. If you are running from an IDE (e.g. Eclipse), press Ctrl+Shift+T and check whether multiple versions of spark-core are on the classpath; a couple of ways to check are sketched below.
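For example, two quick checks along those lines (a sketch only; it assumes a stock Spark 2.1.0 install with SPARK_HOME set and a standard Maven build, so adjust paths to your setup):

    # List the Scala and spark-core jars bundled with the Spark installation
    ls $SPARK_HOME/jars | grep -E 'scala-library|spark-core'

    # Show which spark-core artifacts (and Scala suffixes) the application pulls in
    mvn dependency:tree | grep spark-core

If the Scala suffix or Spark version differs between the two, align the application's pom.xml with what the cluster ships.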

This worked for me.

Upvotes: 0
