Reputation: 11
I tried to compile the https://github.com/apache/spark project using IntelliJ IDEA with the sbt plugin on Windows.
I'm getting an error from sbt, and since I'm not familiar with sbt, I don't know how to fix it.
The error messages are as follows:
[info] Loading project definition from F:\codeReading\sbtt\spark-master\project
[info] Compiling 3 Scala sources to F:\codeReading\sbtt\spark-master\project\target\scala-2.10\sbt-0.13\classes...
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:26: object sbt is not a member of package com.typesafe
[error] import com.typesafe.sbt.pom.{PomBuild, SbtPomKeys}
[error] ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:51: not found: type PomBuild
[error] object SparkBuild extends PomBuild {
[error] ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:118: not found: value SbtPomKeys
[error] otherResolvers <<= SbtPomKeys.mvnLocalRepository(dotM2 => Seq(Resolver.file("dotM2", dotM2))),
[error] ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:178: value projectDefinitions is not a member of AnyRef
[error] super.projectDefinitions(baseDirectory).map { x =>
[error] ^
[error] four errors found
[error] (plugins/compile:compile) Compilation failed
Upvotes: 1
Views: 2041
Reputation: 66886
Spark is built with Maven. The SBT build is only a convenience. You will have far better results importing it as a Maven project.
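A minimal sketch of the Maven route (the exact flags can vary by Spark version, and the memory settings are an assumption based on the Spark build docs of the time):
REM from the Spark checkout, on Windows cmd
set MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512m
mvn -DskipTests clean package
Then point IntelliJ IDEA at the top-level pom.xml and import it as a Maven project instead of the sbt build.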
Upvotes: 2
Reputation: 74629
It looks like IDEA doesn't like the project reference to a git project that the Spark build definition uses for sbt-pom-reader.
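For reference, such a reference usually lives in the meta-build under project/project; the following is only a sketch of the general sbt 0.13 pattern (the object and val names are made up), not Spark's exact build code:
import sbt._

object PluginsBuild extends Build {
  // The "plugins" meta-build project depends on sbt-pom-reader, which sbt
  // clones from the git URI into ~/.sbt/0.13/staging before loading it.
  lazy val plugins = Project("plugins", file(".")).dependsOn(sbtPomReader)
  lazy val sbtPomReader = RootProject(uri("https://github.com/ScrapCodes/sbt-pom-reader.git"))
}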
The reference showed up when I ran sbt within the cloned project:
➜ spark git:(master) ✗ xsbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/oss/spark/project/project
[info] Loading project definition from /Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Loading project definition from /Users/jacek/oss/spark/project
[info] Set current project to spark-parent (in build file:/Users/jacek/oss/spark/)
>
You can see the project reference to the sbt-pom-reader git project when you switch to the plugins project, where the reference is defined:
> reload plugins
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/oss/spark/project/project
[info] Updating {file:/Users/jacek/oss/spark/project/project/}project-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Loading project definition from /Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Updating {file:/Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project/}sbt-pom-reader-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Loading project definition from /Users/jacek/oss/spark/project
> projects
[info] In file:/Users/jacek/oss/spark/project/
[info]     * plugins
[info]       spark-style
[info] In https://github.com/ScrapCodes/sbt-pom-reader.git
[info]       sbt-pom-reader
A solution could be executing sbt gen-idea to generate the project files for IDEA. It's just a guess, though.
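Note that gen-idea is not built into sbt; it comes from the sbt-idea plugin. A hedged sketch of enabling it globally for sbt 0.13 (the file name and plugin version below are assumptions):
// ~/.sbt/0.13/plugins/idea.sbt
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
With the plugin on the classpath, running sbt gen-idea in the Spark checkout should generate the IDEA project files.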
Upvotes: 1