deepblue

Reputation: 8596

"./sbt/sbt assembly" errors "Not a valid command: assembly" for Apache Spark project

I'm having trouble installing Apache Spark on Ubuntu 13.04. I'm using spark-0.8.1-incubating, and both ./sbt/sbt update and ./sbt/sbt compile work fine. However, when I run ./sbt/sbt assembly I get the following error:

[info] Set current project to default-289e76 (in build file:/node-insights/server/lib/spark-0.8.1-incubating/sbt/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error]         ^

I searched for anything related to this but couldn't find anything useful. Any guidance would be much appreciated.

Upvotes: 3

Views: 10284

Answers (2)

Josh Rosen

Reputation: 13821

The "Set current project to default-289e76" message suggests that sbt was run from outside of the Spark source directory:

$  /tmp  ./spark-0.8.1-incubating/sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Set current project to default-d0f036 (in build file:/private/tmp/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error]         ^

Running ./sbt/sbt assembly works fine from the spark-0.8.1-incubating directory (note the log output showing that the current project was set correctly):

$  spark-0.8.1-incubating  sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project/project
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project
[info] Set current project to root (in build file:/private/tmp/spark-0.8.1-incubating/)
...
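The directory matters because sbt treats the current working directory as the build root: a directory without Spark's project/ build definitions (which pull in the sbt-assembly plugin) gets an auto-generated default-XXXXXX build that has no "assembly" task. A minimal shell sketch of that discovery check (directory names are illustrative, not sbt internals):

```shell
#!/bin/sh
# Hedged sketch: emulate sbt's build-discovery behavior. If the build
# root has a project/ directory with build definitions, the assembly
# task defined there is available; otherwise sbt generates a throwaway
# default-XXXXXX build that doesn't know the "assembly" command.
check_build_root() {
    if [ -d "$1/project" ]; then
        echo "real build: assembly task available"
    else
        echo "default build: Not a valid command: assembly"
    fi
}

tmp=$(mktemp -d)              # stands in for /tmp in the transcript above
mkdir -p "$tmp/spark/project" # stands in for spark-0.8.1-incubating
check_build_root "$tmp"
check_build_root "$tmp/spark"
```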

Upvotes: 6

swartzrock

Reputation: 729

Apache Spark has its own copy of sbt, so make sure you're running Spark's version to pick up the "assembly" plugin among other customizations.

To run the Spark installation of sbt, go to the Spark directory and run ./sbt/sbt assembly.
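As a guard against invoking sbt from the wrong place, a small wrapper can check for the bundled launcher first. This is a sketch: run_spark_sbt is an illustrative helper, not part of Spark, and the echo stands in for the real invocation.

```shell
#!/bin/sh
# Hedged sketch: refuse to invoke the bundled sbt unless the current
# directory looks like the Spark source root (i.e. ./sbt/sbt exists).
run_spark_sbt() {
    if [ -x "./sbt/sbt" ]; then
        echo "would run: ./sbt/sbt $*"
        # ./sbt/sbt "$@"   # in real use, replace the echo with this line
    else
        echo "error: ./sbt/sbt not found; cd into the Spark directory first" >&2
        return 1
    fi
}

# Demonstrate against a throwaway directory that mimics the Spark layout.
tmp=$(mktemp -d)
mkdir -p "$tmp/sbt"
printf '#!/bin/sh\n' > "$tmp/sbt/sbt"
chmod +x "$tmp/sbt/sbt"
cd "$tmp"
run_spark_sbt assembly
```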

Upvotes: 4
