K P

Reputation: 851

Spark driver unable to start in cluster

I am submitting jobs to a Spark cluster using SparkLauncher, which starts the Spark driver on one of the worker nodes. Driver startup always fails, with this exception appearing 16 times:

level="WARN",threadName="main",logger="org.apache.spark.util.Utils",message="Service 'Driver' could not bind on port 0. Attempting port 1."

Does anyone have any ideas?

Upvotes: 1

Views: 1605

Answers (2)

K P

Reputation: 851

I finally figured it out. Setting the environment variable SPARK_LOCAL_IP=0.0.0.0 on the machine you launch the job from seems to fix it. Per the documentation, Spark will then choose a random port for the driver to run on any worker node in the cluster. As @yyny noted in a comment, if you want to pin the driver to a fixed port instead, you can set "spark.driver.port".
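For reference, here is a minimal sketch of how this could be wired up with SparkLauncher. The jar path, main class, and master URL are placeholders; the environment map passed to the constructor is how SparkLauncher forwards variables such as SPARK_LOCAL_IP to the spark-submit process it spawns.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchWithLocalIp {
    public static void main(String[] args) throws Exception {
        // Environment variables passed here are applied to the child
        // spark-submit process that SparkLauncher spawns.
        Map<String, String> env = new HashMap<>();
        env.put("SPARK_LOCAL_IP", "0.0.0.0");

        SparkAppHandle handle = new SparkLauncher(env)
                .setAppResource("/path/to/my-app.jar")  // placeholder jar
                .setMainClass("com.example.MyApp")      // placeholder main class
                .setMaster("spark://master:7077")       // placeholder master URL
                .setDeployMode("cluster")               // driver runs on a worker
                .startApplication();

        // Wait until the application reaches a terminal state.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
    }
}
```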

Upvotes: 3

tesnik03

Reputation: 1359

The port number looks incorrect; you can change it through spark.driver.port.
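As a sketch of that approach, spark.driver.port can be set through SparkLauncher's setConf before launching; the jar path, main class, master URL, and port value below are placeholders, not values from the question.

```java
import org.apache.spark.launcher.SparkLauncher;

public class LaunchWithFixedDriverPort {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setAppResource("/path/to/my-app.jar")  // placeholder jar
                .setMainClass("com.example.MyApp")      // placeholder main class
                .setMaster("spark://master:7077")       // placeholder master URL
                .setDeployMode("cluster")
                // Pin the driver to a known port instead of a random one.
                .setConf("spark.driver.port", "7078")   // placeholder port
                .launch();
        spark.waitFor();
    }
}
```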

Upvotes: 0
