VB_

Reputation: 45732

Spark Job Server multithreading and dynamic allocation

I had pretty big expectations for Spark Job Server, but found that it critically lacks documentation.

Could you please answer one or all of the following questions:

  1. Does Spark Job Server submit jobs through a Spark session?
  2. Is it possible to run several jobs in parallel with Spark Job Server? I've seen people run into trouble with this, but I haven't seen a solution yet.
  3. Is it possible to run several jobs in parallel with different CPU, core, and executor configurations?

Upvotes: 0

Views: 346

Answers (1)

noorul

Reputation: 1353

  1. Spark Jobserver does not support SparkSession yet. We will be working on it.
  2. You can either create multiple contexts, or run a single context with the FAIR scheduler.
  3. Use different contexts with different resource configurations.
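For option 2, the FAIR scheduler is enabled through ordinary Spark configuration; the job server simply passes it along to the context it creates. A minimal sketch (the pool names and file path here are made up for illustration):

```
# Spark config for the context (e.g. in the job server's context config)
spark.scheduler.mode  FAIR
spark.scheduler.allocation.file  /path/to/fairscheduler.xml
```

```xml
<!-- fairscheduler.xml — example pools; names and weights are illustrative -->
<allocations>
  <pool name="interactive">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>
  </pool>
  <pool name="batch">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
  </pool>
</allocations>
```

Jobs submitted to the same context can then set `spark.scheduler.pool` to choose a pool, so several jobs share one context's resources fairly instead of running strictly one after another.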

Basically, job server is just a REST API for creating Spark contexts, so you should be able to do anything you could do with a Spark context.
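To make that concrete, here is a sketch of the REST calls for points 2 and 3, assuming a job server running at `localhost:8090`; the context names, app name, and class path are placeholders, and `num-cpu-cores` / `memory-per-node` are the context-creation parameters spark-jobserver accepts:

```shell
# Create two contexts with different resource configurations:
curl -X POST 'localhost:8090/contexts/small-ctx?num-cpu-cores=2&memory-per-node=512m'
curl -X POST 'localhost:8090/contexts/big-ctx?num-cpu-cores=8&memory-per-node=4g'

# Submit a job to a specific context via the context query parameter:
curl -X POST 'localhost:8090/jobs?appName=my-app&classPath=com.example.MyJob&context=small-ctx'
```

Jobs submitted to different contexts run in separate JVMs with their own resources, so they execute in parallel independently of each other.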

Upvotes: 1
