Reputation: 2620
I know this is a common question, and there are related questions like this one, but I want to ask for the best way to fit my scenario, because I haven't used Celery yet.
My service uses multiprocessing.Process to create multiple campaign orders, and within each campaign order it again uses multiprocessing.Process to create multiple ads (campaign and ad have a 1-to-M relationship).
As you know, if I use multiprocessing on both the campaign and the ad creation parts, it fails with "daemonic processes are not allowed to have children". I think Celery may hit a similar problem, although I haven't used it yet.
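A minimal reproducer of the error (a sketch, not my actual service code): Pool workers are daemonic, so starting a child Process from inside one raises AssertionError. The "fork" start method is used here so it runs as a plain script on Linux/macOS; the function names are made up.

```python
import multiprocessing

ctx = multiprocessing.get_context("fork")

def create_ad(ad_id):
    print("creating ad", ad_id)

def create_campaign(campaign_id):
    # Nested Process inside a daemonic Pool worker -> AssertionError
    child = ctx.Process(target=create_ad, args=(campaign_id,))
    child.start()
    child.join()

def reproduce():
    try:
        with ctx.Pool(1) as pool:
            # The AssertionError raised in the worker is re-raised
            # in the parent when map() collects the result.
            pool.map(create_campaign, [0])
    except AssertionError as exc:
        return str(exc)
    return None
```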
My question is: what is the general way to solve this kind of problem? Should I still use Celery, or is there a way to work around it?
Thanks a lot
Upvotes: 13
Views: 10773
Reputation: 13748
You should use a message queue to decouple the two levels of work.
For example:
main program: create a task, push it to queue_1
campaign worker: get a task from queue_1, process it, and push one ad task per ad to queue_2
ad worker: get a task from queue_2, process it, done.
The logic is simple and easy to implement yourself. There are also existing libraries for this kind of thing, such as rq/celery.
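The pipeline above can be sketched with the standard-library queue module and threads in a single process, just for illustration; in a real deployment queue_1/queue_2 would live in a broker (e.g. Redis or RabbitMQ) and each worker would be a separate rq/celery worker process. The worker names and the two-ads-per-campaign fan-out are made up here.

```python
import queue
import threading

queue_1 = queue.Queue()   # campaign tasks
queue_2 = queue.Queue()   # ad tasks
processed_ads = []

def campaign_worker():
    while True:
        campaign = queue_1.get()
        if campaign is None:               # sentinel: shut down
            break
        for ad_id in range(2):             # 1-to-M: fan out ad tasks
            queue_2.put((campaign, ad_id))
        queue_1.task_done()

def ad_worker():
    while True:
        task = queue_2.get()
        if task is None:                   # sentinel: shut down
            break
        processed_ads.append(task)         # "process" the ad task
        queue_2.task_done()

workers = [threading.Thread(target=campaign_worker),
           threading.Thread(target=ad_worker)]
for w in workers:
    w.start()

for campaign in ("campaign_a", "campaign_b"):
    queue_1.put(campaign)

queue_1.join()                             # all campaigns fanned out
queue_2.join()                             # all ad tasks processed
queue_1.put(None)                          # stop the workers
queue_2.put(None)
for w in workers:
    w.join()
```

Because neither worker spawns a child process, there is no daemonic-children restriction anywhere in this design; adding more campaign or ad workers is just starting more consumers on the same queues.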
If you get an AssertionError, use a thread instead:
def run_in_subprocess(func, *args, **kwargs):
    from multiprocessing import Process
    proc = Process(target=func, args=args, kwargs=kwargs)
    proc.daemon = True
    proc.start()
    return proc

def run_in_thread(func, *args, **kwargs):
    from threading import Thread
    thread = Thread(target=func, args=args, kwargs=kwargs)
    thread.daemon = True
    thread.start()
    return thread

def run_task(config):
    try:
        # Process.start() raises AssertionError when the current
        # process is itself daemonic (e.g. a pool/celery worker)
        return run_in_subprocess(xxxx_task, config)
    except AssertionError:
        print('daemonic processes are not allowed to have children, use thread')
        return run_in_thread(xxxx_task, config)
I use this code in some demo apps, but I don't recommend using it in production.
Upvotes: 4