chen amos

Reputation: 111

The resources allocated by YARN are no longer active. Will other jobs use them?

I run a Spark job that allocates a lot of resources in YARN, and the job runs for a long time.

Towards the end, the Spark job's tasks only actively use one or two cores.

I want to know whether the inactive resources can be used by other Spark or MapReduce jobs.

Or can the resources only be used by other jobs once the first Spark job has completed?

Upvotes: 0

Views: 168

Answers (1)

ankush1377

Reputation: 208

It depends on your queueing policy and the scheduler specified for each queue.

I assume you just have a single default queue (root) to which all your jobs are submitted. In that case, the default scheduler behaves as a FIFO scheduler, which starts a new job only once the earlier submitted job completes.

If that's not the case, you can check your queues and the scheduler specified for each of them in the etc/hadoop/capacity-scheduler.xml file.
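As a minimal sketch (assuming the CapacityScheduler is in use, with two hypothetical queue names "spark" and "mr" chosen for illustration), capacity-scheduler.xml could split the root queue so a long-running Spark job in one queue does not hold back other jobs submitted to the other:

    <!-- capacity-scheduler.xml: split root into two hypothetical queues -->
    <property>
      <name>yarn.scheduler.capacity.root.queues</name>
      <value>spark,mr</value>
    </property>
    <property>
      <!-- queue for the long-running Spark job, guaranteed 60% of cluster resources -->
      <name>yarn.scheduler.capacity.root.spark.capacity</name>
      <value>60</value>
    </property>
    <property>
      <!-- queue for other Spark/MR jobs, guaranteed the remaining 40% -->
      <name>yarn.scheduler.capacity.root.mr.capacity</name>
      <value>40</value>
    </property>

Note that the per-level queue capacities must add up to 100, and the queue names above are only placeholders for this sketch.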

More information on the two types of schedulers:

https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html

https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/FairScheduler.html
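If you instead want resources shared fairly between concurrently running jobs, one option (a sketch, not something your cluster necessarily has configured) is to switch the ResourceManager to the FairScheduler in yarn-site.xml:

    <!-- yarn-site.xml: select the FairScheduler instead of the CapacityScheduler -->
    <property>
      <name>yarn.resourcemanager.scheduler.class</name>
      <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler</value>
    </property>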

Upvotes: 1
