Reputation: 742
I am wondering if it is possible to pass parameters in a Scala Spark program using the context or something similar. I mean, I read some parameters from spark-submit inside my app, but those parameters are only needed "at the end" (let's say). So I have to pass them from the driver to another file, and then to another file, and so on, and my method calls end up with a huge list of parameters. Thank you in advance!
Upvotes: 0
Views: 327
Reputation: 1186
The key point to understand is that you provide spark-submit with the application jar file and any command-line parameters you wish it to pass along when invoking the jar.
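For illustration, here is a minimal sketch of how those arguments arrive in the driver (the class name, paths, and parameter names are hypothetical):

```scala
object MyApp {
  def main(args: Array[String]): Unit = {
    // args holds exactly the positional parameters passed after the jar, e.g.
    //   spark-submit --class MyApp my-app.jar /data/input /data/output 0.5
    val inputPath  = args(0)
    val outputPath = args(1)
    val threshold  = args(2).toDouble
    println(s"Running with input=$inputPath output=$outputPath threshold=$threshold")
  }
}
```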
My understanding is that you only need some of those parameters at the very end of execution and you do not want to carry all of them through nested function calls. I would say there is definite scope for refactoring the design.
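One such refactoring, sketched here with hypothetical names, is to parse the arguments once into a single case class and pass that one object down instead of a long parameter list:

```scala
// Hypothetical parameter set; adapt the fields to your actual arguments
case class AppParams(inputPath: String, outputPath: String, threshold: Double)

object Pipeline {
  def main(args: Array[String]): Unit = {
    val params = AppParams(args(0), args(1), args(2).toDouble)
    stageOne(params) // one argument instead of many
  }

  def stageOne(params: AppParams): Unit = stageTwo(params)

  def stageTwo(params: AppParams): Unit =
    // only the last stage actually uses the values
    println(s"Writing to ${params.outputPath} with threshold ${params.threshold}")
}
```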
In any case, one trick you can employ is to write those parameters to a JSON file and make it available for your Spark application to read when necessary (I would write the parameters to AWS S3 and read them back when needed).
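As a sketch of that idea (the path and field names are hypothetical; the local path stands in for an s3a:// URI), you could persist the parameters as a one-row JSON dataset and read them back later:

```scala
import org.apache.spark.sql.SparkSession

// Same hypothetical parameter set as above
case class AppParams(inputPath: String, outputPath: String, threshold: Double)

object ParamStore {
  // In practice this could be an S3 location, e.g. "s3a://my-bucket/conf/params.json"
  val paramsPath = "/tmp/app-params.json"

  // Called once in the driver, right after parsing the spark-submit arguments
  def save(spark: SparkSession, params: AppParams): Unit = {
    import spark.implicits._
    Seq(params).toDS().write.mode("overwrite").json(paramsPath)
  }

  // Called "at the end", from whichever file needs the values
  def load(spark: SparkSession): AppParams = {
    import spark.implicits._
    spark.read.json(paramsPath).as[AppParams].head()
  }
}
```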
Alternatively, you can create an implicit value and carry it throughout the code, although I believe that would not be a good design.
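For completeness, the implicit variant would look roughly like this (same hypothetical AppParams): each method declares one implicit parameter, and the compiler threads the value through the call chain for you:

```scala
case class AppParams(inputPath: String, outputPath: String, threshold: Double)

object ImplicitPipeline {
  def main(args: Array[String]): Unit = {
    // Declared once; the compiler passes it to every call that asks for it
    implicit val params: AppParams = AppParams(args(0), args(1), args(2).toDouble)
    stageOne()
  }

  def stageOne()(implicit params: AppParams): Unit = stageTwo()

  def stageTwo()(implicit params: AppParams): Unit =
    println(s"Writing to ${params.outputPath} with threshold ${params.threshold}")
}
```

The downside, and why I would avoid it, is that the dependency becomes invisible at call sites, which makes the data flow harder to follow than an explicit parameter.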
Upvotes: 3