Reputation: 19617
I'm trying to use the multiprocessing module and, more particularly, the Pool.apply_async() function.
This code works well:
import multiprocessing

def do():
    print("Foobar", flush=True)

with multiprocessing.Pool(1) as pool:
    for i in range(2):
        pool.apply_async(do)
    pool.close()
    pool.join()
The "Foobar"
string is printed twice.
However, if I put this code in a function and then call that function, nothing happens: no error, no "Foobar"; the program ends silently.
import multiprocessing

def test():
    def do():
        print("Foobar", flush=True)

    with multiprocessing.Pool(1) as pool:
        for i in range(5):
            pool.apply_async(do)
        pool.close()
        pool.join()

test()
Why is that? I'm using Python 3.7.3 on Linux.
Upvotes: 2
Views: 3573
Reputation: 15040
In order to retrieve your computation results, make the following change to your code.
import multiprocessing

def test():
    def do():
        print("Foobar", flush=True)

    with multiprocessing.Pool(1) as pool:
        for i in range(5):
            result = pool.apply_async(do)
            result.get()
        pool.close()
        pool.join()

test()
You will see the reason why "nothing happens":
Traceback (most recent call last):
File "/tmp/test.py", line 17, in <module>
test()
File "/tmp/test.py", line 12, in test
result.get()
File "/usr/lib/python3.5/multiprocessing/pool.py", line 608, in get
raise self._value
File "/usr/lib/python3.5/multiprocessing/pool.py", line 385, in _handle_tasks
put(task)
File "/usr/lib/python3.5/multiprocessing/connection.py", line 206, in send
self._send_bytes(ForkingPickler.dumps(obj))
File "/usr/lib/python3.5/multiprocessing/reduction.py", line 50, in dumps
cls(buf, protocol).dump(obj)
AttributeError: Can't pickle local object 'test.<locals>.do'
Python multiprocessing.Pool relies on the pickle protocol to serialize the data to be sent to the other process. The pickle protocol can serialize only top-level functions, not nested ones.
To see what can be pickled and what cannot, check the documentation.
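For example, a common fix (a minimal sketch, assuming the worker does not need anything from test's local scope) is to move do to module level so that pickle can serialize a reference to it:

import multiprocessing

def do():
    # Defined at module level, so the pickle protocol can serialize
    # a reference to it (module and name) instead of the function object.
    print("Foobar", flush=True)

def test():
    with multiprocessing.Pool(1) as pool:
        for i in range(5):
            pool.apply_async(do)
        pool.close()
        pool.join()

if __name__ == "__main__":
    # Not strictly required with the default fork start method on Linux,
    # but the guard keeps the example safe under spawn as well.
    test()

With this layout each of the five tasks prints "Foobar", because the worker is now importable by name in the child process.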
Upvotes: 5