Reputation: 7202
I have a celery task:
@task
def foo():
    part1()
    part2()
    part3()
...that I'm breaking up into a chain of subtasks:
@task
def foo():
    @task
    def part1():
        ...
    @task
    def part2():
        ...
    @task
    def part3():
        ...
    chain(part1.s(), part2.s(), part3.s()).delay()
The subtasks are inner functions because I don't want them executed outside the context of the parent task. The issue is that my worker does not detect or register the inner tasks (I am using autoregister
to discover apps and task modules). If I move them out to module level, alongside the parent task foo
, it works fine.
Does celery support inner functions as tasks? If so, how do I get workers to register them?
Upvotes: 1
Views: 1449
Reputation:
The problem with your code is that you get a new definition of part1
every time you call foo()
. Note also that no part1
function exists at all until foo
is called, so it is impossible for Celery to register any of the part
functions when it initializes a worker.
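To see why, here is a minimal plain-Python sketch. A toy registry dict stands in for Celery's task registry (hypothetical names, not the real Celery API): the decorator on an inner function only runs when the enclosing function is called, so nothing inner is registered at import time.

```python
# Toy stand-in for Celery's task registry (not the real API).
REGISTRY = {}

def task(fn):
    # Registration happens the moment the decorator runs.
    REGISTRY[fn.__name__] = fn
    return fn

@task
def foo():
    @task
    def part1():  # decorated only when foo() actually executes
        return "part1"
    return part1()

# At "import time" only foo is registered; part1 does not exist yet.
print(sorted(REGISTRY))  # ['foo']
foo()
print(sorted(REGISTRY))  # ['foo', 'part1']
```

A worker that scans the registry before any call to foo() would therefore never see the inner tasks.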
I think the following code is the closest to what you want.
def make_maintask():
    @task
    def subtask1():
        print("do subtask")
    # ...
    @task
    def _maintask():
        chain(subtask1.si(), subtask2.si(), subtask3.si()).delay()
    return _maintask

maintask = make_maintask()
This way, the definitions of subtask1
and the rest are not visible from outside the factory function.
If all you want to do is hide the subtasks
, please think twice. The designers of the Python language did not believe that one needs access control such as public and private as in Java: it is a feature that severely complicates a language for dubious benefit. I think well-organized packages and modules, together with good names (such as a leading underscore), can solve all your problems.
If all _maintask
does is delegate subtasks to other workers, you don't really need to define it as a Celery task. Don't make a Celery task call another Celery task unless you really need it.
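Under that advice, the parent can be an ordinary function that just wires the steps together. A toy sketch with plain functions standing in for tasks (run_pipeline and the part functions are hypothetical names, and this uses plain calls rather than the Celery chain API):

```python
from functools import reduce

# Plain functions standing in for tasks; each step feeds the next,
# as in a Celery chain of .s() signatures.
def part1(x):
    return x + 1

def part2(x):
    return x * 2

def part3(x):
    return x - 3

def run_pipeline(x):
    """Ordinary function (not a task) that dispatches the steps in order."""
    return reduce(lambda acc, step: step(acc), (part1, part2, part3), x)

print(run_pipeline(5))  # 9
```

Because run_pipeline is not itself a task, no worker slot is tied up waiting while the steps run.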
Upvotes: 2