Grief

Reputation: 2040

asyncio + multiprocessing + unix

I have a pet project with the following logic:

import asyncio, multiprocessing

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    asyncio.get_event_loop().run_until_complete(sub_main())

def start():
    multiprocessing.Process(target=sub_loop).start()

start()

If you run it, you'll see:

Hello from subprocess

That is good. But what I need to do is make start() a coroutine instead:

async def start():
    multiprocessing.Process(target=sub_loop).start()

To run it, I have to do something like this:

asyncio.get_event_loop().run_until_complete(start())

Here is the issue: when the subprocess is created, the whole Python environment is cloned into it, so the event loop is already marked as running there:

Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "test.py", line 7, in sub_loop
    asyncio.get_event_loop().run_until_complete(sub_main())
  File "/usr/lib/python3.5/asyncio/base_events.py", line 361, in run_until_complete
    self.run_forever()
  File "/usr/lib/python3.5/asyncio/base_events.py", line 326, in run_forever
    raise RuntimeError('Event loop is running.')
RuntimeError: Event loop is running.

I tried to destroy the loop on the subprocess side with no luck, but I think the correct way is to prevent it from being shared with the subprocess in the first place. Is that possible somehow?

UPDATE: Here is the full failing code:

import asyncio, multiprocessing

import asyncio.unix_events

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    asyncio.get_event_loop().run_until_complete(sub_main())


async def start():
    multiprocessing.Process(target=sub_loop).start()

asyncio.get_event_loop().run_until_complete(start())

Upvotes: 10

Views: 11384

Answers (2)

Vincent

Reputation: 13425

First, you should consider using loop.run_in_executor with a ProcessPoolExecutor if you plan to run Python subprocesses from within the loop. As for your problem, you can use the event loop policy functions to set a new loop:

import asyncio
from concurrent.futures import ProcessPoolExecutor

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    # Create a fresh loop for this worker process instead of
    # reusing the one inherited from the parent.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(sub_main())

async def start(executor):
    await asyncio.get_event_loop().run_in_executor(executor, sub_loop)

if __name__ == '__main__':
    executor = ProcessPoolExecutor()
    asyncio.get_event_loop().run_until_complete(start(executor))
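
Note that run_in_executor also propagates return values and exceptions from the worker process back to the awaiting coroutine. A minimal sketch, with an illustrative cpu_bound helper (not part of the original example):

import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n):
    # Runs in one of the pool's worker processes.
    return sum(i * i for i in range(n))

async def main(executor):
    loop = asyncio.get_event_loop()
    # Extra positional arguments are forwarded to the callable;
    # the result is awaited like any other coroutine.
    result = await loop.run_in_executor(executor, cpu_bound, 1000000)
    print(result)

if __name__ == '__main__':
    executor = ProcessPoolExecutor()
    asyncio.get_event_loop().run_until_complete(main(executor))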

Upvotes: 16

Gerrat

Reputation: 29720

You should always add a check to see how you're running the code (the if __name__ == '__main__': part). Your subprocess is running everything in the module a second time, giving you grief (couldn't resist).

import asyncio, multiprocessing

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    asyncio.get_event_loop().run_until_complete(sub_main())


async def start():
    multiprocessing.Process(target=sub_loop).start()

if __name__ == '__main__':
    asyncio.get_event_loop().run_until_complete(start())
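
That re-import behaviour is what the 'spawn' start method does (it is the default on Windows). Forcing it on Unix also sidesteps the inherited event loop entirely, because each child starts from a fresh interpreter. A minimal sketch, assuming Python 3.4+ for set_start_method:

import asyncio, multiprocessing

async def sub_main():
    print('Hello from subprocess')

def sub_loop():
    # Under 'spawn' the child is a fresh interpreter, so this
    # creates a brand-new loop rather than reusing the parent's.
    asyncio.get_event_loop().run_until_complete(sub_main())

async def start():
    multiprocessing.Process(target=sub_loop).start()

if __name__ == '__main__':
    # Without the guard, every spawned child would re-run this
    # top-level code when it re-imports the module.
    multiprocessing.set_start_method('spawn')
    asyncio.get_event_loop().run_until_complete(start())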

Upvotes: 1
