David

Reputation: 213

Is it possible to combine jobs in the Spark UI?

We are running an optimization routine in Spark that makes many (e.g. 100s) of gradient calls. Each gradient evaluation is listed as its own Spark job in the UI. Is it possible to somehow collapse all these calls into a single job?

Upvotes: 1

Views: 73

Answers (1)

Guy Melul

Reputation: 159

It is currently not possible. What you can do instead is use the REST API to build your own web UI tailored to your needs. Query the driver's REST API while the application is running; once the application has finished, you will need to switch to the history server's REST API.
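As a sketch of that approach: the running driver serves the REST API at `/api/v1` on the UI port (4040 by default), and `/applications/{app-id}/jobs` returns one record per job. The host, port, and field selection below are assumptions for illustration; the summarizing step is plain local aggregation, not a Spark feature.

```python
import json
from urllib.request import urlopen

def fetch_jobs(base_url, app_id):
    """Fetch all job records for one application from the Spark REST API.

    base_url is assumed to be e.g. "http://localhost:4040" for a running
    driver, or the history server address after the app has finished.
    """
    with urlopen(f"{base_url}/api/v1/applications/{app_id}/jobs") as resp:
        return json.load(resp)

def summarize(jobs):
    """Collapse a list of per-call job records into a single roll-up."""
    return {
        "jobs": len(jobs),
        "tasks": sum(j.get("numTasks", 0) for j in jobs),
        "failed": sum(1 for j in jobs if j.get("status") == "FAILED"),
    }

# Example records shaped like the REST API's job output, so the
# aggregation can be demonstrated without a live cluster:
sample = [
    {"jobId": 0, "status": "SUCCEEDED", "numTasks": 8},
    {"jobId": 1, "status": "SUCCEEDED", "numTasks": 8},
]
print(summarize(sample))  # {'jobs': 2, 'tasks': 16, 'failed': 0}
```

In a real run you would replace `sample` with `fetch_jobs(...)` and, for the hundreds-of-gradient-calls case, see all of them collapsed into one summary line instead of hundreds of UI rows.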

You would probably also want to increase the number of retained jobs if you run many iterations:

spark.ui.retainedJobs (default: 1000)
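For example, the limit can be raised when submitting the application (the value 5000 is just an illustrative choice):

```
spark-submit --conf spark.ui.retainedJobs=5000 <your-app>
```

The same setting can also go in `spark-defaults.conf` as `spark.ui.retainedJobs 5000`.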

Spark REST API docs: https://spark.apache.org/docs/latest/monitoring.html#rest-api

Spark Web UI configuration reference: https://spark.apache.org/docs/latest/configuration.html#spark-ui

Edit: There might be a reverse proxy for YARN, so you could always make the calls to the same address, but it would probably cause a redirect, which would slow down the response time.

Upvotes: 1
