codeBarer

Reputation: 2378

How do you properly set up Scala Spark libraryDependencies with the correct version of Scala?

I'm new to Scala and Spark and I'm trying to create an example project in IntelliJ. During project creation I chose Scala and sbt with Scala version 2.12, but when I tried adding spark-streaming version 2.3.2 it kept erroring out, so I Googled around and found the sbt config shown below on Apache's website. I'm still getting the same error.

Error: Could not find or load main class SparkStreamingExample
Caused by: java.lang.ClassNotFoundException: SparkStreamingExample

How do I determine which version of Scala works with which version of the Spark dependencies?

name := "SparkStreamExample"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.11" % "2.3.2"
)

My object class is very basic; there isn't much to it...

import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext

object SparkStreamingExample extends App {
    println("SPARK Streaming Example")
}

Upvotes: 1

Views: 196

Answers (1)

euclio

Reputation: 1487

You can see which version of Scala is supported by a given Spark release in the Spark documentation.

As of this writing, the documentation says:

Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.3.2 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

Notice that only Scala 2.11.x is supported.
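In practice, the easiest way to keep scalaVersion and the Spark artifact's Scala suffix in sync is to let sbt append the suffix for you with the %% operator instead of hard-coding _2.11. A minimal build.sbt sketch, assuming Spark 2.3.2 (and therefore Scala 2.11):

name := "SparkStreamExample"

version := "0.1"

// Spark 2.3.2 is built against Scala 2.11, so the project must use a 2.11.x compiler
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.11) automatically,
  // so the dependency cannot drift out of sync with scalaVersion
  "org.apache.spark" %% "spark-streaming" % "2.3.2"
)

With %% in place, setting scalaVersion to an incompatible series (e.g. 2.12.x) should fail at dependency resolution with an "unresolved dependency" error, rather than surfacing later as a confusing runtime problem.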

Upvotes: 2
