Felipe

Reputation: 7583

Why does Spark with Play fail with "NoClassDefFoundError: Could not initialize class org.apache.spark.SparkConf$"?

I am trying to use this project (https://github.com/alexmasselot/spark-play-activator) as an example of integrating Play and Spark so I can do the same in my own project. I created an object that starts Spark and a controller that reads a JSON file using an RDD. Below is the object that starts Spark:

package bootstrap

import org.apache.spark.sql.SparkSession

object SparkCommons {
  val sparkSession = SparkSession
    .builder
    .master("local")
    .appName("ApplicationController")
    .getOrCreate()
}

and my build.sbt is like this:

import play.sbt.PlayImport._

name := """crypto-miners-demo"""    
version := "1.0-SNAPSHOT"    
lazy val root = (project in file(".")).enablePlugins(PlayScala)    
scalaVersion := "2.12.4"

libraryDependencies += guice
libraryDependencies += evolutions
libraryDependencies += jdbc
libraryDependencies += filters
libraryDependencies += ws

libraryDependencies += "com.h2database" % "h2" % "1.4.194"
libraryDependencies += "com.typesafe.play" %% "anorm" % "2.5.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test

libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.0"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0"
libraryDependencies += "org.xerial" % "sqlite-jdbc" % "3.19.3"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"

But when I call the controller that uses the RDD, I get this error in the Play framework:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.SparkConf$

I am using the RDD like this: val rdd = SparkCommons.sparkSession.read.json("downloads/tweet-json"). The application whose configuration I am trying to copy works fine. I was only able to bring the jackson-databind lib over to my build.sbt; I get an error when I copy libraryDependencies ++= Dependencies.sparkAkkaHadoop and ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) } into my build.sbt.
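For reference, the controller calls it roughly like this (the controller and action names here are only illustrative, not my exact code):

package controllers

import javax.inject._
import play.api.mvc._
import bootstrap.SparkCommons

@Singleton
class TweetController @Inject()(cc: ControllerComponents) extends AbstractController(cc) {

  // Reads the JSON files into a DataFrame through the shared SparkSession
  def count = Action {
    val df = SparkCommons.sparkSession.read.json("downloads/tweet-json")
    Ok(s"Loaded ${df.count()} tweets")
  }
}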

Upvotes: 4

Views: 6582

Answers (1)

Felipe

Reputation: 7583

I will write it 100,000 times on the blackboard and never forget: Spark 2.2.0 still doesn't work with Scala 2.12. I also changed the Jackson lib version. Below is my build.sbt.

import play.sbt.PlayImport._

name := """crypto-miners-demo"""

version := "1.0-SNAPSHOT"

lazy val root = (project in file(".")).enablePlugins(PlayScala)

scalaVersion := "2.11.8"

libraryDependencies += guice
libraryDependencies += evolutions
libraryDependencies += jdbc
libraryDependencies += filters
libraryDependencies += ws

libraryDependencies += "com.h2database" % "h2" % "1.4.194"
libraryDependencies += "com.typesafe.play" %% "anorm" % "2.5.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test

libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.0"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0"
libraryDependencies += "org.xerial" % "sqlite-jdbc" % "3.19.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5"
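A quick note on what changed: %% makes sbt append the project's Scala binary version to the artifact name, so with scalaVersion := "2.11.8" the Spark lines above resolve to the _2.11 artifacts. The original build mixed Scala 2.12.4 with _2.11 Spark jars, which is what produces the NoClassDefFoundError. Roughly, the two notations are equivalent:

// With scalaVersion := "2.11.8", this line:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
// resolves to the same artifact as writing the suffix by hand:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"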

Upvotes: 8
