shengshan zhang

Reputation: 538

How to set dependency scope in sbt according to profile?

How do I set a dependency's scope in build.sbt for different environments? For example:

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "compile"  // expected in dev
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided" // expected in prod

Any suggestions?

Upvotes: 2

Views: 829

Answers (2)

Izhar Ahmed

Reputation: 195

You could set an environment identifier in an environment variable and use a Scala match expression in build.sbt to get the desired result.

Your build.sbt would look like this:

// read the environment from the EXEC_MODE environment variable, defaulting to "dev"
val mode = sys.env.getOrElse("EXEC_MODE", "dev") // can be hardcoded
val devSparkVersion  = "2.0.2"
val prodSparkVersion = "1.6.2"

mode match {
  case "dev"  => libraryDependencies += "org.apache.spark" %% "spark-core" % devSparkVersion
  case "prod" => libraryDependencies += "org.apache.spark" %% "spark-core" % prodSparkVersion
}
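
Since the question asks about switching the dependency scope rather than the version, the same match approach can drive that as well. A minimal sketch, assuming the same EXEC_MODE variable and a sparkVersion val like the one in the question:

val sparkVersion = "2.0.2" // assumed; use whatever version the project targets

// pick the configuration from the mode: "provided" in prod, "compile" in dev
val sparkScope = mode match {
  case "prod" => "provided"
  case _      => "compile"
}

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % sparkScope

Running e.g. EXEC_MODE=prod sbt package then builds with Spark marked as provided, while a plain sbt package keeps it on the compile classpath.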

Upvotes: 4

F. Lins
F. Lins

Reputation: 634

I've never tried this, but according to this documentation:

http://www.scala-sbt.org/1.0/docs/Configuring-Scala.html

it looks like, if you set

autoScalaLibrary := false

then you can use "test", "compile", or "runtime".
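
For reference, a sketch of what that page describes (not tested, as the answer says): with autoScalaLibrary disabled, sbt no longer adds scala-library automatically, so you declare it yourself and can pick its configuration:

autoScalaLibrary := false

// declare scala-library manually so its configuration can be chosen
libraryDependencies += "org.scala-lang" % "scala-library" % scalaVersion.value % "runtime"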

Upvotes: 1
