Reputation: 8298
I came across this detailed explanation of how to set up build.sbt
for Spark.
But then I read about the SBT plugin for Spark packages
(https://github.com/databricks/sbt-spark-package),
where apparently a single line is enough, as the plugin does the nasty work.
Is this understanding correct?
Upvotes: 1
Views: 233
Reputation: 4600
I would say yes. If you take a look at a simple .sbt file used for Spark (e.g., https://github.com/databricks/learning-spark/blob/master/build.sbt), you can see that you need to include a bunch of Spark dependencies yourself (though not necessarily all the ones listed there).
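For reference, a minimal manual build.sbt tends to look roughly like this (a sketch, not copied from the linked file; the project name and version numbers are illustrative assumptions):

```scala
// build.sbt — minimal manual setup without the plugin
name := "my-spark-app"        // hypothetical project name

version := "0.1.0"

scalaVersion := "2.11.12"     // must match the Scala version your Spark distribution was built with

libraryDependencies ++= Seq(
  // "provided" keeps Spark out of your assembly jar, since the cluster supplies it at runtime
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.2.0" % "provided"
)
```

The main chore is keeping the Scala version, Spark version, and "provided" scopes consistent by hand.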
In addition, if you look at what the plugin does, you will find some more utility functions: https://github.com/databricks/sbt-spark-package/blob/master/src/main/scala/sbtsparkpackage/SparkPackagePlugin.scala
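With the plugin, the setup shrinks to roughly the following (a sketch based on the plugin's README; the plugin and Spark version numbers here are assumptions, so check the repo for current ones):

```scala
// project/plugins.sbt — pull in sbt-spark-package from the Spark Packages repository
resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
```

```scala
// build.sbt — the plugin wires up the Spark dependencies for you
sparkVersion := "2.2.0"          // which Spark to build against

sparkComponents ++= Seq("sql")   // adds spark-sql on top of the default spark-core
```

So the "single line" is essentially the addSbtPlugin line, plus a couple of settings telling the plugin which Spark version and components you want.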
It's a cool plugin!
Upvotes: 1