Reputation: 147
I have a question regarding Apache Spark running on YARN in cluster mode. According to this thread, Spark itself does not have to be installed on every (worker) node in the cluster.

My problem is with the Spark executors: in general, YARN, or rather the ResourceManager, decides on resource allocation, so Spark executors could be launched on any (worker) node in the cluster. But then, how can YARN launch Spark executors if Spark is not installed on the (worker) nodes?
Upvotes: 2
Views: 1239
Reputation: 35404
At a high level, when a Spark application is launched on YARN:
The Spark driver passes serialized actions (code) to the executors so they can process the data.
The spark-assembly JAR provides the Spark-related classes needed to run Spark jobs on the YARN cluster; YARN distributes it to the worker nodes as part of the application's resources, so Spark does not have to be pre-installed there, while the application ships its own functional JARs.
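For illustration, here is a minimal sketch of an application submitted in cluster mode; the class name and HDFS paths are hypothetical. The lambdas in it are what the driver serializes and ships to the executors, while YARN localizes the Spark jars and the application JAR into each container, which is why nothing needs to be pre-installed on the workers.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical application class, packaged into my-app.jar.
object WordCount {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WordCount")
      .getOrCreate()

    // The lambdas below are serialized by the driver and sent to the executors.
    val counts = spark.sparkContext
      .textFile("hdfs:///data/input.txt")   // hypothetical input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.saveAsTextFile("hdfs:///data/output")  // hypothetical output path
    spark.stop()
  }
}
```

Submitting it with `spark-submit --master yarn --deploy-mode cluster --class WordCount my-app.jar` is enough; the NodeManagers download the localized jars before starting the executor JVMs.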
Edit: (2017-01-04)
Spark 2.0 no longer requires a fat assembly JAR for production deployment (source).
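As a rough sketch of what replaced the assembly in Spark 2.x: the Spark jars can be pre-staged on HDFS and referenced via the `spark.yarn.jars` or `spark.yarn.archive` properties (the HDFS path below is hypothetical). In cluster mode these are normally set in `spark-defaults.conf` or passed with `--conf` at `spark-submit` time; the programmatic form here only illustrates the property name.

```scala
import org.apache.spark.sql.SparkSession

// spark.yarn.archive points YARN at an archive of the Spark jars
// (e.g. a zip of $SPARK_HOME/jars uploaded to HDFS), replacing the
// old monolithic spark-assembly jar. The path below is hypothetical.
val spark = SparkSession.builder()
  .appName("no-assembly-demo")
  .config("spark.yarn.archive", "hdfs:///apps/spark/spark-libs.zip")
  .getOrCreate()
```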
Upvotes: 2