Ali AzG

Reputation: 1983

How to run a Spark Scala program in a Linux terminal?

I wrote a Spark program in Scala, and now I want to run it from the terminal. In PySpark I use spark-submit for a Python file, and I want to do the same for my Scala program. I do not want to use IntelliJ or write my program in spark-shell; I just want to write my code in an editor and run it with a command in the terminal. Is that possible? Thank you in advance.

Upvotes: 1

Views: 2418

Answers (1)

Praveen L

Reputation: 987

Create a JAR file from your code (say the JAR is named HelloWorld.jar). You can use an HDFS path or a local path, as in the examples below.
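As a minimal sketch of the packaging step, you can build the JAR with sbt. The project name, Scala version, and Spark version below are assumptions — adjust them to your setup:

```scala
// build.sbt — hypothetical example; pick the Scala/Spark versions you actually use
name := "HelloWorld"

version := "1.0"

scalaVersion := "2.11.12"

// "provided" because the Spark runtime supplies these JARs when you spark-submit
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0" % "provided"
```

Then run `sbt package` in the project root; the JAR is written under `target/scala-2.11/`, and that is the path you pass to spark-submit.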

You can add many more options to the commands below; they are listed in the URL given by philantrovert in the comments.

Run in local mode:

spark-submit --class path.to.YourMainClass  --master local[*] /path/to/your/jar/HelloWorld.jar

Run in cluster mode:

spark-submit --deploy-mode cluster --class path.to.YourMainClass  --master yarn hdfs://nameservice1/hdfsPath/to/your/jar/HelloWorld.jar

Upvotes: 2
