Reputation: 5139
This problem is similar to Passing command line arguments to Spark-shell. However, that didn't give me the answer I want, so I am rephrasing my problem below.
I want to run a Scala script in spark-shell with the following command:
spark-shell -i file.scala
It works well without any program arguments. However, I want to pass command line arguments to file.scala. Following the approach that works in the Scala shell (http://alvinalexander.com/scala/scala-shell-script-command-line-arguments-args), I tried:
spark-shell -i file.scala args1 args2
and then tried to retrieve the arguments in the script, but it failed:
var input = args(0)
var output = args(1)
The error message says that args is not recognized.
Does anyone know how to do this?
Upvotes: 1
Views: 6677
Reputation: 3956
There is a difference between Scala and spark-shell: spark-shell is a wrapper around Scala and runs in distributed mode, so parameter passing does not work the same way as it does for a plain Scala script.
Because the application has to run in distributed mode in Spark, you will not be able to pass parameters to the script as positional arguments on the spark-shell command line.
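A common workaround is to smuggle the arguments in through a Spark configuration property instead of positional arguments. The sketch below assumes this approach; note that `spark.driver.args` is not a built-in Spark setting, just an arbitrary caller-chosen key passed via `--conf`:

```scala
// Launch the shell with the arguments packed into a custom conf key
// (key name "spark.driver.args" is our own choice, not a Spark built-in):
//
//   spark-shell -i file.scala --conf spark.driver.args="args1 args2"
//
// Inside file.scala, read the key back and split it into tokens.

// parseArgs turns the raw conf value into an argument array,
// dropping empty tokens from repeated whitespace
def parseArgs(raw: String): Array[String] =
  raw.split("\\s+").filter(_.nonEmpty)

// In spark-shell, `sc` is the predefined SparkContext, so the
// script body would look like (commented out here, since it needs
// a running shell):
//
//   val args   = parseArgs(sc.getConf.get("spark.driver.args"))
//   val input  = args(0)
//   val output = args(1)
```

An environment variable read via `sys.env` works the same way if you prefer not to touch the Spark configuration.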
Upvotes: 3