odonnry

Reputation: 189

Spark job not showing up on standalone cluster GUI

I am playing with running Spark jobs in my lab and have a three-node standalone cluster. When I execute a new job on the master node via the CLI with spark-submit sparktest.py --master spark://myip:7077, the job completes as expected but does not show up at all on the cluster GUI. After some investigation, I added --master to the submit command, but to no avail. During job execution, as well as after completion, when I navigate to http://mymasternodeip:8080/, none of these jobs appear under Running Jobs or Completed Jobs. Any thoughts as to why the jobs don't show up would be appreciated.

Upvotes: 1

Views: 496

Answers (1)

Mohana B C

Reputation: 5487

You should specify the --master flag first, then the remaining flags/options. Otherwise, everything after the application file is treated as an argument to your script and the master defaults to local.

spark-submit --master spark://myip:7077 sparktest.py

Also make sure you don't override the master config in your code while creating the SparkSession object. Either provide the same master URL in the code or leave it out entirely.
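For example, a minimal sketch of what sparktest.py could look like, assuming the standalone master URL from the question; it leaves the master to spark-submit so the submitted application shows up on the cluster UI at port 8080:

# sparktest.py (illustrative sketch)
from pyspark.sql import SparkSession

# No .master(...) call here: the master URL comes from
# spark-submit --master spark://myip:7077. If you do set it in code,
# it must be the same standalone URL, otherwise the job runs locally
# and never appears on the cluster UI.
spark = (
    SparkSession.builder
    .appName("sparktest")
    .getOrCreate()
)

spark.range(1000).count()  # trivial action so the application actually runs on the cluster
spark.stop()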

Upvotes: 1
