Abu Tahir

Reputation: 382

Dependency not resolved

I cannot compile the Scala code; a dependency error is popping up!

I have another sample application that works fine with the existing setup, but this code does not work on the same architecture.

Error logs:

sbt compile
[info] Set current project to firstScalaScript (in build file:/home/abu/Current%20Workspace/)
[info] Updating {file:/home/abu/Current%20Workspace/}current-workspace...
[info] Resolving org.apache.spark#spark-core;2.0.1 ...
[warn]  module not found: org.apache.spark#spark-core;2.0.1
[warn] ==== local: tried
[warn]   /home/abu/.ivy2/local/org.apache.spark/spark-core/2.0.1/ivys/ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core/2.0.1/spark-core-2.0.1.pom
[warn] ==== Akka Repository: tried
[warn]   http://repo.akka.io/releases/org/apache/spark/spark-core/2.0.1/spark-core-2.0.1.pom
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core;2.0.1: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn] 
[warn]  Note: Unresolved dependencies path:

My code:

import scala.io.Source._ 
import org.apache.spark.SparkContext 
import org.apache.spark.SparkContext._ 
import org.apache.spark.SparkConf 
import org.apache.log4j.Logger 
import org.apache.log4j.Level 
import org.apache.spark.rdd.RDD 
import org.apache.hadoop.io.compress.GzipCodec 

object firstScalaScript {
    def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf())
        val rdd = sc.textFile("e.txt,r.txt").collect()
        //rdd.saveAsTextFile("confirmshot.txt"); sc.stop()
    }
}
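For reference, the build.sbt is not shown above, but the resolution log (Ivy looking for org.apache.spark#spark-core;2.0.1 with no Scala suffix) suggests the dependency is declared roughly like this; the project name and Scala version below are guesses, not taken from the question:

name := "firstScalaScript"

scalaVersion := "2.11.8"

// Declared with a single % and no _2.11 suffix, so Ivy searches for
// org.apache.spark#spark-core;2.0.1, an artifact that does not exist
// in any of the configured repositories.
libraryDependencies += "org.apache.spark" % "spark-core" % "2.0.1"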

Upvotes: 0

Views: 700

Answers (2)

T. Gawęda

Reputation: 16086

Spark dependencies have an additional suffix in the artifactId: the Scala version.

For example, it should be spark-core_2.11 for Scala 2.11.

In SBT it should be:

// Scala version will be added by SBT while building
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"

or:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"

Note: the second form should not be used with libraries that have a Scala dependency, because the first one automatically chooses the proper artifact. Use it only for non-Scala dependencies.
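Putting it together, a minimal build.sbt along these lines should resolve; this is only a sketch, and the project name and Scala version are assumptions rather than values from the question:

name := "firstScalaScript"

scalaVersion := "2.11.8"

// %% makes SBT append the Scala binary version, so this resolves to
// the spark-core_2.11 artifact.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"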

Upvotes: 0

Stefano Bonetti

Reputation: 9023

Spark artifacts (and those of many other libraries) are packaged and distributed for different versions of Scala. To distinguish between them, the Scala version is appended to the end of the artifact name, e.g. spark-core_2.10 or spark-core_2.11.

Your spark-core dependency is incomplete, as it's missing the Scala version.

SBT can help by appending the Scala version you're using to the artifact name at build time. You can add the dependency as

"org.apache.spark" %% "spark-core" % "2.0.1"

and that will translate to

"org.apache.spark" % "spark-core_YOUR_SCALA_VERSION" % "2.0.1"

All the details about this JAR can be found in the Maven repository. Note that on that page you can also find suggestions on how to import the library using other tools such as Maven or Gradle.

Upvotes: 1
