Reputation: 1849
I am using ./bin/spark-submit to run my Spark job. It runs fine, but when I open the Spark web UI, the job does not appear in the completed applications list.
./bin/spark-submit --name "myapp" --master local --conf "spark.master=spark://fahad:7077" --class com.apptest.App ~/app-0.0.1-SNAPSHOT.jar
Note: Spark version 2.0.1, one worker running, master UI at localhost:8080; both the worker and the master were started from the ./sbin/start-*.sh scripts.
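For reference, starting them from those scripts presumably looked something like this (the master URL spark://fahad:7077 is inferred from the spark.master setting above):
./sbin/start-master.sh
./sbin/start-slave.sh spark://fahad:7077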
Upvotes: 3
Views: 4273
Reputation: 21810
There are two different UIs: the regular Spark web UI and the Spark History Server. The one that shows jobs after they have completed is the History Server.
http://spark.apache.org/docs/latest/monitoring.html
The docs explain that you need to start it by running:
./sbin/start-history-server.sh
This creates a web interface at http://server-url:18080 by default, listing incomplete and completed applications and attempts.
When using the file-system provider class (see spark.history.provider below), the base logging directory must be supplied in the spark.history.fs.logDirectory configuration option, and should contain sub-directories that each represents an application’s event logs.
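For example, a minimal History Server setup in conf/spark-defaults.conf might look like the following sketch. The file:/tmp/spark-events directory is an arbitrary local path chosen for illustration (it happens to be the default), and FsHistoryProvider is the standard file-system provider class:
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.fs.logDirectory file:/tmp/spark-events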
The Spark jobs themselves must be configured to log events, and to log them to the same shared, writable directory. For example, if the server was configured with a log directory of hdfs://namenode/shared/spark-logs, then the client-side options would be:
spark.eventLog.enabled true
spark.eventLog.dir hdfs://namenode/shared/spark-logs
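Applied to your spark-submit command, that means adding two --conf flags, along these lines (file:/tmp/spark-events is an example directory and must match the History Server's spark.history.fs.logDirectory; the cluster URL is passed directly via --master here, since an explicit --master flag takes precedence over a spark.master set through --conf):
./bin/spark-submit --name "myapp" \
  --master spark://fahad:7077 \
  --conf "spark.eventLog.enabled=true" \
  --conf "spark.eventLog.dir=file:/tmp/spark-events" \
  --class com.apptest.App ~/app-0.0.1-SNAPSHOT.jar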
Upvotes: 5