Key Jun

Reputation: 450

Cannot connect to spark cluster on intellij but spark-submit can

My Spark cluster has 4 workers and works fine if I use the spark-submit command like this:

spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://220.149.84.24:7077 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 2G \
  --total-executor-cores 100 \
  examples/jars/spark-examples_2.11-2.4.5.jar 1000

But if I try to run it from IntelliJ, I get this error:

20/06/12 15:51:23 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://220.149.84.24:7077...
20/06/12 15:51:23 INFO TransportClientFactory: Successfully created connection to /220.149.84.24:7077 after 23 ms (0 ms spent in bootstraps)
20/06/12 15:51:43 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://220.149.84.24:7077...
20/06/12 15:52:03 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://220.149.84.24:7077...
20/06/12 15:52:23 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.

This is so weird, right? It's the same Spark master address, "spark://220.149.84.24:7077". Please help me with this error.

Here is the SparkContext configuration (I'm using Spark 2.4.5):

// SparkContext setup
import org.apache.spark.{SparkConf, SparkContext}

val conf: SparkConf = new SparkConf()
conf.setMaster("spark://220.149.84.24:7077") // standalone master URL
conf.setAppName("AirbnbRecommender") // app name
conf.set("spark.driver.bindAddress", "127.0.0.1") // driver IP
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
conf.set("spark.kryoserializer.buffer.max", "128m")
conf.set("spark.eventLog.enabled", "true")

val sc = new SparkContext(conf)

Upvotes: 1

Views: 549

Answers (1)

Aramis NSR

Reputation: 1847

Among all these technologies, I still wonder why Spark needs to be run through spark-submit; you won't see this with MongoDB or Kafka, just Spark!

One option is to use a REST API provider like Apache Livy (although I didn't like it when I tried it a year ago); a sketch of a batch submission is below.
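A minimal sketch of submitting a batch through Livy's REST API, assuming a Livy server is running alongside the master on its default port 8998; the jar path is a placeholder, and the host is taken from the question:

import java.io.OutputStreamWriter
import java.net.{HttpURLConnection, URL}
import scala.io.Source

object LivySubmit {
  def main(args: Array[String]): Unit = {
    // Livy batch request: jar to run, main class, and program arguments
    val payload =
      """{"file": "/path/to/spark-examples_2.11-2.4.5.jar",
        |  "className": "org.apache.spark.examples.SparkPi",
        |  "args": ["1000"]}""".stripMargin

    val conn = new URL("http://220.149.84.24:8998/batches")
      .openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.setDoOutput(true)

    val writer = new OutputStreamWriter(conn.getOutputStream)
    writer.write(payload)
    writer.close()

    // Livy replies with the batch id and state, e.g. {"id":0,"state":"starting",...}
    println(Source.fromInputStream(conn.getInputStream).mkString)
  }
}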

The other option: make your server "GUI capable" with Xorg or something similar, log on to it, install IntelliJ, and submit your jobs in a local fashion. You can use your own PC to test your scenarios, since IntelliJ supports local Spark runs, and once you are sure your syntax and algorithm are fine, ship the code to your repository, or copy and paste it onto your server and work with it there.
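For the local-testing route, here is a minimal local-mode sketch (assuming spark-core 2.4.5 is on the IntelliJ project classpath; the object and app names are made up for illustration):

import org.apache.spark.{SparkConf, SparkContext}

object LocalSmokeTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[*]") // run inside the IDE JVM, no cluster needed
      .setAppName("AirbnbRecommenderLocal")
    val sc = new SparkContext(conf)
    // quick sanity check that the local runtime works before targeting the cluster
    println(sc.parallelize(1 to 1000).map(_ * 2).sum())
    sc.stop()
  }
}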

Good luck.

Upvotes: 1
