Make42

Reputation: 13088

"unresolved dependency" for Spark 2.0.1 on SBT

With my build.sbt

version := "1.0"
scalaVersion := "2.11.8"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"

I am trying to get Spark into my sbt 0.13 project, but IntelliJ 2016.2.5 gives the error "unresolved dependency". What am I doing wrong?

There is no proxy, and it works if I say "2.0.0" instead of "2.0.1", but that is not so nice...

Also: it seems to work in the CLI, but not in IntelliJ.
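In case it matters: with the scalaVersion above, I believe the same dependency can also be written with %%, which should append the _2.11 suffix automatically:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"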

Upvotes: 4

Views: 2851

Answers (3)

rhoeting

Reputation: 396

The answer from Mateusz Kubuszok is close. I simply add http://central.maven.org/maven2 to my resolvers, rather than https://mvnrepository.com/.

resolvers += "MavenRepository" at "http://central.maven.org/maven2"

This seems to have many of the Spark dependencies.
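A minimal build.sbt combining this resolver with the dependency from the question might look like the sketch below (the project name is just a placeholder):

    name := "spark-example"
    version := "1.0"
    scalaVersion := "2.11.8"

    resolvers += "MavenRepository" at "http://central.maven.org/maven2"

    // %% appends the Scala binary suffix, so this is equivalent to "spark-core_2.11"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"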

Upvotes: 0

user1068378

Reputation: 319

I am currently using this build.sbt to get the Spark 2 dependencies, but it seems I am not getting them, as the code fails with exceptions such as:

    C:\Users\marco\SparkExamples\src\main\scala\MachineLearningExamples.scala:3: object mllib is not a member of package org.apache.spark
    [error] import org.apache.spark.mllib.regression.LabeledPoint
    [error]        ^
    [error] C:\Users\marco\SparkExamples\src\main\scala\MachineLearningExamples.scala:4: object classification is not a member of package org.apache.spark.ml
    [error] import org.apache.spark.ml.classification._
    [error]        ^

Here's my build.sbt. I'm wondering if anyone can try it, to rule out sbt issues on my local machine. Thanks.

    name := "SparkExamples"
    version := "1.0"
    scalaVersion := "2.11.8"

    val sparkVersion = "2.0.1"

    libraryDependencies += "junit" % "junit" % "4.8" % "test"
    libraryDependencies ++= Seq(
      "org.slf4j" % "slf4j-api" % "1.7.5",
      "org.slf4j" % "slf4j-simple" % "1.7.5",
      "org.clapper" %% "grizzled-slf4j" % "1.0.2")
    libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
    libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion
    libraryDependencies += "org.apache.spark" %% "spark-mllib" % sparkVersion
    libraryDependencies += "org.apache.spark" %% "spark-streaming-flume-sink" % sparkVersion
    libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion

    resolvers += "MavenRepository" at "https://mvnrepository.com/"

If I use sparkVersion 1.6.0 (with a minor tweak to the spark-streaming-flume dependency), everything works fine.
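For reference, this is roughly the kind of minimal check I would expect to compile once the spark-mllib and spark-ml artifacts resolve (the object name is just a placeholder, not my actual code):

    // Compile check: these imports only resolve if spark-mllib and spark-ml
    // were actually downloaded by sbt.
    import org.apache.spark.mllib.regression.LabeledPoint
    import org.apache.spark.mllib.linalg.Vectors
    import org.apache.spark.ml.classification.LogisticRegression

    object DependencyCheck {
      def main(args: Array[String]): Unit = {
        val point = LabeledPoint(1.0, Vectors.dense(0.0, 1.0))  // from spark-mllib
        val lr = new LogisticRegression()                       // from spark-ml
        println(point)
        println(lr.getClass.getName)
      }
    }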

Upvotes: 0

Mateusz Kubuszok

Reputation: 27535

Out of the box, SBT loads only the https://repo1.maven.org/maven2/ repository, which as far as I can tell currently has no Apache Spark. Maybe another project you built fetched it from another repo, and now it is resolved using your local Ivy cache?

You can solve the issue by adding another Maven repository to your project, like:

resolvers ++= Seq(
  Resolver.sonatypeRepo("public"),
  Resolver.typesafeRepo("releases")
)

UPDATE: If you want to use MavenRepository (you are not using it out of the box), you can try adding:

resolvers += "MavenRepository" at "https://mvnrepository.com/"
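After changing the resolvers, it may also be worth re-importing the project in IntelliJ (or running sbt update on the command line), so that a fresh resolution is attempted instead of reusing whatever is already in the Ivy cache.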

Upvotes: 1
