Reputation: 63082
The Spark jars have been successfully published to the local repository:
sbt publish-local
Here is an excerpt of the output for spark-core - things look healthy:
[info] published spark-core_2.10 to C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT\spark-core_2.10-1.1.0-SNAPSHOT-javadoc.jar
[info] published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\poms\spark-core_2.10.pom
[info] published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\jars\spark-core_2.10.jar
[info] published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\srcs\spark-core_2.10-sources.jar
[info] published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\docs\spark-core_2.10-javadoc.jar
[info] published ivy to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\ivys\ivy.xml
In particular, here is the spark-core directory in the .m2 repository:
C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT>dir
Directory of C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT
06/26/2014 04:25 PM <DIR> .
06/26/2014 04:25 PM <DIR> ..
06/26/2014 04:25 PM 1,180,476 spark-core_2.10-1.1.0-SNAPSHOT-javadoc.jar
06/26/2014 04:24 PM 808,815 spark-core_2.10-1.1.0-SNAPSHOT-sources.jar
06/26/2014 02:27 PM 5,781,917 spark-core_2.10-1.1.0-SNAPSHOT.jar
06/26/2014 05:03 PM 13,436 spark-core_2.10-1.1.0-SNAPSHOT.pom
The problem comes when trying to consume the jars in a client project.
Here is an excerpt from the client build.sbt:
val sparkVersion = "1.1.0-SNAPSHOT"
..
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion % "compile->default" withSources(),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion % "compile->default" withSources()
..
resolvers ++= Seq(
  "Apache repo" at "https://repository.apache.org/content/repositories/releases",
  "Local Repo" at Path.userHome.asFile.toURI.toURL + "/.m2/repository",
  Resolver.mavenLocal
)
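(As a side note, sbt publish-local writes to the Ivy local repository under ~/.ivy2/local, which sbt's built-in "local" resolver already checks; the .m2 copy only matters for Maven-style resolution. If an explicit Ivy-style resolver were wanted anyway, it would look roughly like the sketch below - the repository name "local-ivy" is arbitrary and this is normally unnecessary:)

// Sketch only: explicit Ivy-style resolver for ~/.ivy2/local
resolvers += Resolver.file(
  "local-ivy",
  file(sys.props("user.home") + "/.ivy2/local")
)(Resolver.ivyStylePatterns)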
So we have the snapshot artifacts published locally and the resolvers configured to find them.
But when we do:
sbt package
We get unresolved dependencies on the same Spark artifacts that we just published:
[info] Loading project definition from C:\apps\hspark\project
[info] Set current project to hspark (in build file:/C:/apps/hspark/)
[info] Updating {file:/C:/apps/hspark/}hspark...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
[info] Resolving org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT ...
[info] Resolving org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT ...
[info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
[info] Resolving org.scala-lang#scala-reflect;2.10.4 ...
[info] Resolving org.scala-lang#jline;2.10.4 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[warn] :: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
unresolved dependency: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
..
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[error] unresolved dependency: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
UPDATE: Based on the answer by @lpiepiora, it seems removing the compile->default mapping does (surprisingly) make the difference. Here is the evidence so far.
(Using the dependency-graph plugin):
Done updating.
[info] default:hspark_2.10:0.1.0-SNAPSHOT [S]
[info]   +-org.apache.spark:spark-core_2.10:1.1.0-SNAPSHOT [S]
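(For reference, the tree above comes from the sbt-dependency-graph plugin; enabling it looks roughly like this in project/plugins.sbt - the version number here is illustrative, not necessarily the one used:)

// project/plugins.sbt - sbt-dependency-graph plugin (version is illustrative)
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

After reloading, a command such as dependency-tree prints the resolved tree shown above.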
Upvotes: 3
Views: 3856
Reputation: 13749
Try removing the compile->default mapping from your dependencies. It is redundant anyway, as the documentation says:
A configuration without a mapping (no "->") is mapped to "default" or "compile". The -> is only needed when mapping to a different configuration than those.
Therefore declare your dependencies as follows:
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion withSources(),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()
)
and they should resolve.
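To confirm, you can then force a fresh resolution and rebuild from the client project, for example:

sbt clean update package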
Upvotes: 3