Rohit Dhiman

Reputation: 93

pom.xml dependencies for Spark while using Scala 2.12.10

These Apache Spark dependencies are not working with Scala 2.12.10:

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.12.10</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.0.1</version>
    </dependency>
</dependencies>

Error while running the Spark app from IntelliJ:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:784)
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:605)
    at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
    at org.apache.spark.SparkConf.set(SparkConf.scala:94)
    at org.apache.spark.SparkConf.set(SparkConf.scala:83)
    at org.apache.spark.SparkConf.setMaster(SparkConf.scala:115)
    at org.apache.spark.SparkContext$.updatedConf(SparkContext.scala:2717)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:153)

However, this set of dependencies works perfectly fine with the same Spark app:

    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.8</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.1</version>
    </dependency>

Code snippet:

import org.apache.spark.SparkContext
import org.apache.log4j.{Level, Logger}

object Testing1 {
  def main(args: Array[String]): Unit = {
    // Silence Spark's console logging
    Logger.getLogger("org").setLevel(Level.OFF)
    val sc = new SparkContext("local[*]", "SparkDemo")
    val lines = sc.textFile("sample.txt")
    val words = lines.flatMap(line => line.split(' '))
    val wordsKVRdd = words.map(x => (x, 1))
    // Count each word, sort by count descending, take the top 10
    val count = wordsKVRdd.reduceByKey((x, y) => x + y)
      .map(x => (x._2, x._1))
      .sortByKey(ascending = false)
      .map(x => (x._2, x._1))
      .take(10)
    count.foreach(println)
  }
}

Upvotes: 2

Views: 2578

Answers (2)

Rohit Dhiman

Reputation: 93

It started working after I added the Scala 2.12.10 SDK in the module settings in IntelliJ. I also deleted the Scala 2.11.8 SDK from the module/project settings.
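For completeness, the Maven build should agree with the IDE on the Scala version, or the mismatch tends to come back on the next reimport. A minimal pom sketch that pins everything to one version (the scala.version property name and the scala-maven-plugin version below are illustrative choices, not taken from the question):

    <properties>
        <scala.version>2.12.10</scala.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <!-- spark-core_2.12 and spark-sql_2.12 as in the question -->
    </dependencies>

    <build>
        <plugins>
            <!-- Compiles the Scala sources so Maven and IntelliJ build against the same version -->
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>4.4.0</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>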

Upvotes: 1

Alex Ott

Reputation: 87249

This error points to a Scala version incompatibility. Either another of your dependencies depends on Scala 2.11, or you just need to run mvn clean to get rid of old classes compiled against Scala 2.11. Also check the Scala version configured in the project settings.
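To find which dependency is dragging in Scala 2.11, the Maven dependency tree filtered to Scala artifacts is usually enough; this uses only standard Maven goals:

    # Drop classes compiled against the old Scala version
    mvn clean
    # List every dependency that resolves to an org.scala-lang artifact
    mvn dependency:tree -Dincludes=org.scala-lang

Any 2.11.x scala-library entry in the output points at the offending dependency; exclude it or switch to its _2.12 artifact.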

Upvotes: 2
