Satish Srinivas

Reputation: 153

In build.sbt, dependencies in parent project not reflected in child modules

I am using SBT 1.8.0 for my Spark Scala project in the IntelliJ IDEA 2017.1.6 IDE. I want to create a parent project and child project modules. So far this is what I have in my build.sbt:

lazy val parent = Project("spark-etl-parent", file("."))
  .settings(
    name := "spark-etl-parent_1.0",
    scalaVersion := "2.11.1",
    // sparkVersion is defined elsewhere in the build
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided")
  )

lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

lazy val redshiftBasin = Project("spark-etl-redshiftBasin", file("redshiftBasin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-redshiftBasin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )


lazy val s3Basin = Project("spark-etl-s3Basin", file("s3Basin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-s3Basin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

Now I am able to import any class from the spark-streaming or spark-hive library dependencies in the parent module, but I am not able to import and use them in any of the child modules. I can use them only if I explicitly specify them as a library dependency in each child module.

  1. I am looking for something similar to the dependencies tag in pom.xml with a Maven build (see the sketch below).
  2. Will it make a difference if I use a separate build.sbt for each of the child modules?
  3. Also, if I do .aggregate(etl) in the parent config, it shows an error because etl is declared later. But if I define etl before parent, I am not able to do .dependsOn(parent) in the etl config.

Please help me with a solution to fix these.
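
For reference, a minimal sketch of what I mean in point 1: a shared settings value (the commonSettings name is a placeholder) defined before all the projects, which would also sidestep the ordering issue in point 3. sparkVersion stands for whatever the build defines:

lazy val commonSettings = Seq(
  version := "1.0",
  scalaVersion := "2.11.1",
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
  )
)

// Each module then mixes the shared settings in directly,
// instead of relying on dependsOn(parent) to propagate them.
lazy val etl = (project in file("etl"))
  .settings(commonSettings)
  .settings(name := "spark-etl-etl_1.0")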

Upvotes: 4

Views: 1617

Answers (2)

Guy

Reputation: 164

Using provided->provided in the dependsOn helped me solve a similar problem: by default dependsOn(parent) only maps the compile configuration, so the parent's "provided" dependencies (the Spark artifacts here) never reach the child's classpath; adding provided->provided maps that configuration as well.

So something like:

lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent % "compile->compile;test->test;provided->provided")
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )

Upvotes: 1

pme

Reputation: 14803

My multi-module project uses the parent project only for building everything and delegates run to the 'server' project:

lazy val petstoreRoot = project.in(file("."))
  .aggregate(sharedJvm, sharedJs, server, client)
  .settings(organizationSettings)
  .settings(
    publish := {}
    , publishLocal := {}
    , publishArtifact := false
    , isSnapshot := true
    , run := {
      (run in server in Compile).evaluated
    }
  )
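
For what it's worth, in newer sbt slash syntax the same delegation can be written as:

run := (server / Compile / run).evaluated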

I grouped the settings (e.g. dependencies) in another file, e.g.:

lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
  "org.julienrf" %%% "play-json-derived-codecs" % "4.0.0"
  ...
  , "org.scalatest" %%% "scalatest" % scalaTestV % Test
))
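
Such a value typically lives in a plain Scala object under the project/ directory. A rough sketch, not the exact file (names and the scalatest version are assumed):

// project/Settings.scala (sketch; the %%% operators above additionally
// need the Scala.js platform-deps autoImport)
import sbt._
import Keys._

object Settings {
  val scalaTestV = "3.0.5" // assumed version, for illustration only

  lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(
    libraryDependencies ++= Seq(
      "org.scalatest" %% "scalatest" % scalaTestV % Test
    )
  )
}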

Now each sub-module just adds whatever is needed, e.g.:

lazy val server = (project in file("server"))
  .settings(scalaJSProjects := Seq(client))
  .settings(sharedSettings(Some("server"))) // shared dependencies used by all
  .settings(serverSettings)
  .settings(serverDependencies)
  .settings(jvmSettings)
  .enablePlugins(PlayScala, BuildInfoPlugin)
  .dependsOn(sharedJvm)

You can find the whole project here: https://github.com/pme123/scala-adapters

See the project/Settings file for the dependencies.

Upvotes: 2
