user2896120

Reputation: 3282

PyRFC won't allow multiprocessing?

I'm using PyRFC on Databricks and I need to make around 20,000 queries to SAP. Given the volume, I want to use multiprocessing. Here's what I have:

from multiprocessing import Process, Queue
from pyrfc import Connection

ASHOST='Some_Server_Name'
CLIENT='xx'
SYSNR='xx'
USER='xxxx'
PASSWD='xxxx'

# Single RFC connection opened in the parent process
conn = Connection(ashost=ASHOST, sysnr=SYSNR, client=CLIENT, user=USER, passwd=PASSWD)

q = Queue()

def worker(plant, material):
    print("called")
    # Call the remote function module and put the result on the queue
    q.put(conn.call(
        "Func_Name",
        **{"WERKS": plant, "MATNR": material.zfill(18)}
    ))

jobs = []
for i in plant_collection[:10]:
    p = Process(target=worker, args=(i["plant"], i["material"]))
    jobs.append(p)

for j in jobs:
    j.start()

print(q.get())

I'm using the multiprocessing library. The print statement inside worker never executes, and the call to q.get() never returns; the script just hangs. What am I doing wrong, and how can I fix it?
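In case it's relevant, this is the variant I plan to try next, assuming the RFC connection handle cannot be shared across processes and that each worker has to open its own Connection (Func_Name, plant_collection, and the logon parameters are the same ones from my setup above):

from multiprocessing import Pool
from pyrfc import Connection

CONN_PARAMS = dict(ashost=ASHOST, sysnr=SYSNR, client=CLIENT,
                   user=USER, passwd=PASSWD)

def query_sap(item):
    # Open a fresh connection inside the child process instead of
    # reusing the handle created in the parent process.
    conn = Connection(**CONN_PARAMS)
    try:
        return conn.call(
            "Func_Name",
            **{"WERKS": item["plant"], "MATNR": item["material"].zfill(18)}
        )
    finally:
        conn.close()

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(query_sap, plant_collection[:10])
    print(results[0])

I haven't verified whether this scales to 20k calls, so I'd also appreciate guidance on whether one connection per process is the right pattern here.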

Upvotes: 0

Views: 156

Answers (0)
