mjbsgll

Reputation: 742

Spark and Java: Error ClassCastException

I am following this example exactly: https://github.com/rathboma/hadoop-framework-examples/tree/master/spark When I try to run it, I get this message:

java.lang.ClassCastException: org.apache.spark.api.java.Optional cannot be cast to com.google.common.base.Optional

I do not know how to fix it, because I am a newbie with Spark. Any suggestions? Thanks!

Upvotes: 1

Views: 455

Answers (1)

zsxwing

Reputation: 20826

This is because you compiled your code against Spark 1.x but are running the application on a Spark 2.x cluster. Update pom.xml so the Spark dependency matches the version your cluster runs, and you will probably need to update your code as well, because the 1.x and 2.x Java APIs are not compatible: Spark 1.x returned Guava's com.google.common.base.Optional from operations like leftOuterJoin, while Spark 2.x returns its own org.apache.spark.api.java.Optional, which is exactly the cast that fails in your stack trace.
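As a rough illustration of the code-level change (a minimal sketch, not the exact code from the linked example; the class and variable names here are made up, and it assumes a Spark 2.x spark-core dependency in pom.xml that matches your cluster), the right side of a leftOuterJoin now comes back as org.apache.spark.api.java.Optional instead of the Guava Optional:

```java
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.Optional;   // Spark 2.x Optional
// import com.google.common.base.Optional;   // what Spark 1.x used instead
import scala.Tuple2;

import java.util.Arrays;

public class OptionalJoinExample {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[*]", "optional-join");

        JavaPairRDD<String, Integer> left = sc.parallelizePairs(
                Arrays.asList(new Tuple2<>("a", 1), new Tuple2<>("b", 2)));
        JavaPairRDD<String, Integer> right = sc.parallelizePairs(
                Arrays.asList(new Tuple2<>("a", 10)));

        // Under Spark 2.x the join's right side is
        // org.apache.spark.api.java.Optional, not Guava's Optional.
        JavaPairRDD<String, Tuple2<Integer, Optional<Integer>>> joined =
                left.leftOuterJoin(right);

        joined.foreach(t -> {
            Optional<Integer> maybeRight = t._2()._2();
            int value = maybeRight.isPresent() ? maybeRight.get() : 0;
            System.out.println(t._1() + " -> " + value);
        });

        sc.stop();
    }
}
```

So besides bumping the spark-core version in pom.xml to match the cluster, replace the com.google.common.base.Optional imports and types with org.apache.spark.api.java.Optional wherever the join results are consumed.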

Upvotes: 1
