Aliza

Reputation: 744

running multiple Spark jobs on a Mesos cluster

I would like to run multiple Spark jobs on my Mesos cluster and have all of them share the same Spark framework. Is this possible? I have tried running the MesosClusterDispatcher and having the Spark jobs connect to the dispatcher, but each Spark job launches its own "Spark Framework" (I have tried both client mode and cluster mode). Is this the expected behaviour? Is it possible to share the same Spark framework among multiple Spark jobs?

Upvotes: 0

Views: 214

Answers (1)

gasparms

Reputation: 3354

Yes, this is normal and it is the expected behaviour.

As far as I know, on Mesos the MesosClusterDispatcher is only in charge of allocating resources for your Spark driver, and it is the driver that acts as a framework. Once the driver has been allocated, it is responsible for talking to Mesos and accepting offers to allocate the executors where the tasks will run.
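Put differently, the framework boundary is the driver: every job run from the same SparkContext shares one framework, while each separate spark-submit registers its own. A minimal sketch (the Mesos master URL and app name are placeholders) of two jobs sharing a single framework might look like this:

    import org.apache.spark.sql.SparkSession

    object SharedFrameworkExample {
      def main(args: Array[String]): Unit = {
        // One driver = one framework registered with Mesos.
        val spark = SparkSession.builder()
          .appName("shared-framework-example")          // placeholder name
          .master("mesos://zk://zk-host:2181/mesos")    // placeholder Mesos master
          .getOrCreate()

        // Both jobs below run inside the same driver, so they share
        // the single framework this driver registered with Mesos.
        spark.sparkContext.parallelize(1 to 100).count()
        spark.sparkContext.parallelize(1 to 100).map(_ * 2).count()

        spark.stop()
      }
    }

So, to have several jobs share one framework, they would need to be issued from the same driver rather than as separate submissions to the dispatcher.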

Upvotes: 2
