JeyJ

Reputation: 4080

Error when trying to import Spark with sbt

I downloaded the Play Framework starter project - the simple one. I'm trying to import Spark version 2.2.0 via sbt, but I'm getting the following error:

sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.2.0: not found

The build.sbt file:

name := """play-scala-starter-example"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
resolvers += Resolver.sonatypeRepo("snapshots")
scalaVersion := "2.11.5"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.2" % Test
libraryDependencies += "com.h2database" % "h2" % "1.4.196"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

All the lines in the build.sbt file are marked in red and show the same error:

expression type must conform to setting in sbt file

The plugin.sbt file:

// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.6.13")

To clarify, I have two problems:

  1. "expression type must conform to setting in sbt file" on every row of the build.sbt file.
  2. I can't import the Spark libraries via sbt.

Upvotes: 0

Views: 1532

Answers (1)

Mahesh Chand

Reputation: 3250

Spark 2.2.0 is built and distributed to work with Scala 2.11 by default. To write Spark applications in Scala, you need a binary-compatible Scala version (i.e. 2.11.x). The unresolved dependency in your error is spark-core_2.12, which means sbt is resolving the artifact for Scala 2.12.x, and Spark 2.2.0 was never published for Scala 2.12. That is why it is throwing the exception.
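A minimal sketch of the fix, assuming the project can stay on Scala 2.11 (the exact 2.11.x patch version is an assumption; any 2.11.x release should work):

```scala
// build.sbt - sketch, assuming the project stays on Scala 2.11
// Spark 2.2.0 artifacts are only published for the 2.11 binary series.
scalaVersion := "2.11.12"

// `%%` appends the Scala binary-version suffix, so this resolves
// org.apache.spark#spark-core_2.11;2.2.0
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

// Equivalent form with the suffix written out explicitly:
// libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
```

After editing, run `reload` in the sbt shell (or restart sbt) so the new `scalaVersion` takes effect before dependencies are resolved again.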

Upvotes: 1
