ugur

Reputation: 410

Spark standalone cluster

I have a Spark standalone cluster. The cluster consists of 2 worker nodes and 1 master node. When I run a program on the master node, jobs are assigned to only one worker; the other worker does nothing.

[screenshot: Spark master web UI showing the two workers]

The workers appear in the picture above. To run my code, I used the following command:

spark-submit --class Main.Main --master spark://172.19.0.2:7077 --deploy-mode cluster Main.jar ReadText.txt  

Upvotes: 1

Views: 300

Answers (2)

Sandeep Purohit

Reputation: 3702

Can you please try once with deploy mode client, or just omit that parameter? What is happening here is that when the deploy mode is cluster, one of your workers runs the driver and the other worker runs the RDD tasks, which is why only one worker executes tasks. When you run the shell, it uses client mode by default and therefore uses both workers for running tasks. Try the command below to deploy the application, and could you please also share a code snippet of your application?

spark-submit --class Main.Main --master spark://172.19.0.2:7077  Main.jar ReadText.txt   
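For reference, here is a minimal sketch of what such an application might look like, assuming Main.Main is a Scala object that takes the input file (ReadText.txt) as its first argument and does a simple word count; your actual code may differ:

// Hypothetical reconstruction of the submitted application (a sketch, not the asker's real code).
package Main

import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    // The master URL is supplied by spark-submit, so it is not hard-coded here.
    val conf = new SparkConf().setAppName("Main")
    val sc = new SparkContext(conf)

    // textFile produces an RDD whose partitions can be scheduled on any
    // worker with free cores, so both workers can share the work.
    val lines = sc.textFile(args(0))
    val counts = lines.flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}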

Upvotes: 0

Bhavesh

Reputation: 919

From the image above, we can see that each of your worker nodes has 1 core.

You can use the command below:

spark-submit --class Main.Main --total-executor-cores 2 --executor-cores 1 --master spark://172.19.0.2:7077 --deploy-mode cluster Main.jar ReadText.txt
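In case it is useful, those two flags correspond to the Spark properties spark.cores.max and spark.executor.cores, so as a sketch they could also be set on the SparkConf inside the application (values set directly on a SparkConf take precedence over spark-submit flags):

import org.apache.spark.SparkConf

// Sketch: the same limits set programmatically on the hypothetical Main object above.
val conf = new SparkConf()
  .setAppName("Main")
  .set("spark.cores.max", "2")       // equivalent to --total-executor-cores 2
  .set("spark.executor.cores", "1")  // equivalent to --executor-cores 1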

Hope this helps!

Upvotes: 0
