Alexander Lopatin

Reputation: 582

Can I assemble one subproject from its directory with SBT?

I defined a multi-project SBT build and declared all dependencies in my root project. When I run sbt assembly from the root directory, everything is okay.

How can I run sbt assembly from a subproject's directory? When I try to do this, SBT can't find the dependencies that are declared in the root build.sbt file.

For example, I do something like this in the root build.sbt:

ThisBuild / organization := "org.example"
ThisBuild / version := "1.0"
ThisBuild / scalaVersion := "2.11.12"
...
lazy val commonSettings = Seq(
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion
   , "org.apache.spark" %% "spark-hive" % sparkVersion
   , "org.apache.spark" %% "spark-sql" % sparkVersion
   // other dependencies
  ).map(_ % Provided) ++ Seq(
    "org.postgresql" % "postgresql" % "42.2.24"
    // other dependencies
  )
)

lazy val root = (project in file("."))
  .aggregate(subproject)
  .settings(
    name := "root"
  )

lazy val subproject = (project in file("subproject"))
  .settings(
    commonSettings,
    name := "subproject"
    //...Other settings
  )

// ScopeFilter for running tasks across the listed projects
val allProjects = ScopeFilter(
  inProjects(
    subproject
  )
)

The build.sbt in the subproject directory:

assembly / mainClass := Some("org.example.Main")
//other settings
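
(For reference, the assembly task itself comes from the sbt-assembly plugin, declared once in the root project/plugins.sbt; the version below is only an example, not necessarily the one I use:)

// project/plugins.sbt — provides the assembly task (version is an example)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.2.0")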

When I run sbt assembly from the root directory, everything is okay. When I run it from the subproject directory, I get errors like this:

object apache is not a member of package org

import org.apache.spark.sql.expressions.UserDefinedFunction

Is it possible to build the jar files from the subprojects' directories?

Upvotes: 2

Views: 286

Answers (1)

Mateusz Kubuszok

Reputation: 27595

You have to build from the directory in which the main build is defined: sbt treats the directory it is launched from as the build root, so starting it inside subproject/ loads only that directory's build.sbt and never sees the dependencies declared in the root build.

However, you don't have to build everything every time. You can simply do:

# in project root directory
sbt "subproject / assembly"

so there isn't even an issue.
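
If you run that task a lot, one option (a minimal sketch using sbt's standard addCommandAlias helper; the alias name here is made up) is to define a shorthand in the root build.sbt:

// root build.sbt — `sbt assemblySub` now runs only the subproject's assembly
addCommandAlias("assemblySub", "subproject / assembly")

After a reload, running sbt assemblySub from the root directory builds just the subproject's jar.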

Upvotes: 4
