Michael

Reputation: 1428

How to run multiple concurrent jobs in Spark using python multiprocessing

I have set up a Spark on YARN cluster on my laptop and am having trouble running multiple concurrent jobs in Spark using Python multiprocessing. I am running in yarn-client mode. I tried two ways to achieve this:

YARN only runs one job at a time, using 3 containers, 3 vcores, and 3 GB of RAM, so there are ample vcores and RAM available for the other jobs, but they are not running.

Upvotes: 2

Views: 7909

Answers (3)

patrick

Reputation: 1

I met the same problem as you, and I solved it by setting .config("spark.executor.cores", '1') in PySpark. Here is my code:

from multiprocessing import Pool

from pyspark.sql import SparkSession


def train(db):
    print(db)
    # each worker process creates its own SparkSession, capped at one core per executor
    spark = SparkSession \
        .builder \
        .appName("scene_" + str(db)) \
        .config("spark.executor.cores", '1') \
        .getOrCreate()
    print(spark.createDataFrame([[1.0], [2.0]], ['test_column']).collect())


if __name__ == '__main__':
    # launch 10 processes, each submitting its own Spark application to YARN
    p = Pool(10)
    for db in range(10):
        p.apply_async(train, args=(db,))
    p.close()
    p.join()

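The key seems to be that with spark.executor.cores set to 1, each SparkSession only asks YARN for a single vcore per executor, so several of these small applications can be scheduled at the same time instead of waiting for one another.
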
Upvotes: 0

Michael

Reputation: 1428

I found the solution here: https://stackoverflow.com/a/33012538/957352

For a single-machine cluster, in the file

/etc/hadoop/conf/capacity-scheduler.xml

change the property yarn.scheduler.capacity.maximum-am-resource-percent from 0.1 to 0.5.
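
This setting controls the fraction of cluster resources that ApplicationMasters may use; at the default 0.1 on a small single-node cluster only one ApplicationMaster fits, so additional applications stay queued. The corresponding block in capacity-scheduler.xml would look roughly like this:

<property>
  <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
  <value>0.5</value>
</property>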

Upvotes: 0

MrE

Reputation: 20768

How many CPUs do you have, and how many are required per job? YARN will schedule the jobs and assign what it can on your cluster: if your job requires 8 CPUs and your system only has 8 CPUs, then other jobs will be queued and run serially.

If you requested 4 CPUs per job, you would see 2 jobs running in parallel at any one time.
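
In Spark on YARN you control that request through the executor settings. A minimal sketch in PySpark (the property names are standard Spark configuration; the specific values are only an illustration for an 8-vcore cluster):

from pyspark.sql import SparkSession

# Illustrative values only: each application asks YARN for 2 executors
# with 2 cores each (4 vcores in total), so two such applications can
# run side by side on an 8-vcore cluster.
spark = SparkSession \
    .builder \
    .appName("small_job") \
    .config("spark.executor.instances", "2") \
    .config("spark.executor.cores", "2") \
    .config("spark.executor.memory", "1g") \
    .getOrCreate()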

Upvotes: 1
