user14339195

Reputation:

Spark UI for Pyspark

How do I access the Spark UI for PySpark jobs? I am trying to open localhost:4040 to track my jobs, but the page does not load. It works when I open spark-shell, but not when I open pyspark.

Upvotes: 0

Views: 1552

Answers (3)

Raphael Roth

Reputation: 27383

Use sparkContext.uiWebUrl to get the URL, where sparkContext is an instance of SparkContext.

Upvotes: 0

badger

Reputation: 3256

The Spark UI provides a real-time view of your Spark job, and when the job terminates you lose that view. To preserve it, add blocking code at the end of your Spark job, such as input(). As Relic16 said, Spark starts from port 4040 and, if that port is occupied, tries port 4041 and so on. Also, if you look at the logs carefully, Spark mentions the IP and port there.

Upvotes: 1

Relic16

Reputation: 332

The Spark UI is available only while your Spark session is alive. Spark also looks for ports starting from 4040 and iterates upward if it cannot use a port. If you start spark-shell, it will mention at startup which port it is using for the Spark UI.

Upvotes: 1