Reputation: 2328
I want to build a subproject in Spark with sbt. I found this example, and it works:
$ ./build/sbt -Phive -Phive-thriftserver (build)
sbt (spark)> project hive (switch to subproject)
sbt (hive)> testOnly *.HiveQuerySuite -- -t foo (run a test case)
However, when I tried the following, it did not build; sbt simply quit:
./build/sbt -mllib
I do not know how the author figured out -Phive -Phive-thriftserver. I cannot find these flags in the Spark source code.
I just want to do the exact same thing as the example but with a different subproject.
This is not asking how to use the projects command to print out all available subprojects.
Upvotes: 0
Views: 216
Reputation: 533
Specify the project scope:
./build/sbt mllib/compile
Refer to: http://www.scala-sbt.org/0.13/docs/Scopes.html
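In case it helps, here is a minimal sketch of both routes, mirroring the Hive example from the question and assuming the subproject shows up as mllib in the projects listing (SomeSuite is only a placeholder suite name):
$ ./build/sbt mllib/compile (batch: compile only the mllib subproject)
$ ./build/sbt "mllib/testOnly *.SomeSuite" (batch: run a single suite in mllib)
$ ./build/sbt (start the interactive sbt shell)
sbt (spark)> projects (list all available subprojects)
sbt (spark)> project mllib (switch to the mllib subproject)
sbt (mllib)> testOnly *.SomeSuite -- -t foo (run one test case, as in the Hive example)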
Upvotes: 1