Balaji Reddy

Reputation: 5710

Is FAIR available for Spark Standalone cluster mode?

I have a 2-node cluster with the Spark standalone cluster manager. I'm triggering more than one job using the same sc with Scala multithreading. What I found is that my jobs are scheduled one after another because of the default FIFO behavior, so I tried to use FAIR scheduling:

    conf.set("spark.scheduler.mode", "FAIR")
    conf.set("spark.scheduler.allocation.file", sys.env("SPARK_HOME") + "/conf/fairscheduler.xml")

    val job1 = Future {
      val job = new Job1()
      job.run()
    }

    val job2 = Future {
      val job = new Job2()
      job.run()
    }


    class Job1 {
      def run(): Unit = {
        sc.setLocalProperty("spark.scheduler.pool", "mypool1")
        // ... actual job logic ...
      }
    }

    class Job2 {
      def run(): Unit = {
        sc.setLocalProperty("spark.scheduler.pool", "mypool2")
        // ... actual job logic ...
      }
    }
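
For reference, here's a minimal self-contained version of what I'm running (a sketch only: the count() is a placeholder workload, and Launcher/runInPool are illustrative names, not my real code). setLocalProperty is per-thread, so it has to be called on the thread that actually triggers the action, which is why it sits inside the Future:

    import org.apache.spark.{SparkConf, SparkContext}
    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration.Duration

    object Launcher {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("fair-scheduling-demo")
          .set("spark.scheduler.mode", "FAIR")
          .set("spark.scheduler.allocation.file",
               sys.env("SPARK_HOME") + "/conf/fairscheduler.xml")
        val sc = new SparkContext(conf)

        def runInPool(pool: String): Future[Long] = Future {
          // setLocalProperty is thread-local, so it must run on the
          // thread that submits the action: hence inside the Future.
          sc.setLocalProperty("spark.scheduler.pool", pool)
          sc.parallelize(1 to 1000000).count() // placeholder workload
        }

        val job1 = runInPool("mypool1")
        val job2 = runInPool("mypool2")
        Await.result(Future.sequence(Seq(job1, job2)), Duration.Inf)
        sc.stop()
      }
    }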



 <pool name="mypool1">
 <schedulingMode>FAIR</schedulingMode>
 <weight>1</weight>
 <minShare>2</minShare>    
  </pool>

 <pool name="mypool2">
 <schedulingMode>FAIR</schedulingMode>
 <weight>1</weight>
 <minShare>2</minShare>    
  </pool>

Job1 and Job2 are triggered from a launcher class. Even after setting these properties, my jobs are handled in FIFO order. Is FAIR available for Spark standalone cluster mode? Is there a page where this is described in more detail? I can't seem to find much about FAIR and standalone in the Job Scheduling docs. I'm following this SO question. Am I missing anything here?

Upvotes: 3

Views: 1009

Answers (1)

I don't think standalone is the problem. Make sure you have actually defined more than one pool and that each job is assigned to a different pool.

FAIR scheduling is done across pools; anything within the same pool will run in FIFO mode anyway.

This is based on the documentation here: https://spark.apache.org/docs/latest/job-scheduling.html#default-behavior-of-pools
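
As a sanity check (my suggestion; not specific to standalone), you can read the pool property back on the submitting thread before triggering an action, and the Spark UI's Stages tab shows which pool each stage ran in:

    // Run this on the same thread that will trigger the action.
    // getLocalProperty returns null if the pool was never set on this thread.
    val pool = sc.getLocalProperty("spark.scheduler.pool")
    println(s"Scheduler pool for this thread: $pool")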

Upvotes: 3
