Reputation: 45
Let's say I have the following code, where I need tasks that depend on other tasks. Some of the tasks are long-running, so I need some kind of .join (asyncio-like) behaviour, where the chain waits until one task ends before continuing with the next one. How do I schedule this in the beat_schedule?
from datetime import timedelta

from celery.schedules import crontab

beat_schedule = {
    'every-2-seconds': {
        'task': 'celery_app.tasks.add',
        'schedule': timedelta(seconds=2),
        'args': (5, 8)
    },
    'specified-time': {
        'task': 'celery_app.tasks.add',
        'schedule': crontab(hour=8, minute=50),
        'args': (50, 50)
    }
}
def chain_demo():
    tasks = [
        add_demo.si(10, 7),
        mul_demo.si(10),
        insert_db_demo.si(),
    ]
    chain(*tasks).apply_async()

@app.task
def add_demo(x, y):
    time.sleep(3)
    return x + y

@app.task
def mul_demo(x, y):
    time.sleep(3)
    return x * y

@app.task(ignore_result=True)
def insert_db_demo(result):
    print('insert db , result {}'.format(result))
Upvotes: 1
Views: 1409
Reputation: 10699
1. chain_demo(), which uses celery.chain(), should be passed the task signatures (created with .s()) piped (|) next to each other, e.g. chain(add.s(4, 4) | mul.s(8)). Reference: https://docs.celeryproject.org/en/latest/getting-started/next-steps.html#chains (a short sketch contrasting .s() and .si() follows after point 2.2 below).

2.1. If you want to use celery_beat to perform periodic/scheduled calls to the entrypoint function chain_demo(), then you must make it a celery task instead of an ordinary python function. That is why the beat_schedule config asks for a task name. As stated in https://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html:

"celery beat is a scheduler; It kicks off tasks at regular intervals, ..."

2.2. In case you have other ways in mind to invoke the entrypoint function chain_demo() other than celery_beat, e.g. calling it from a Django app or from an ordinary python script, then it doesn't necessarily need to be a celery task (though that is also possible); it could be just an ordinary python function.
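To illustrate point 1, here is a minimal sketch of the difference between .s() and .si() (only an illustration; the module name, task names and broker URL below are placeholders, not part of the original setup). With .s(), each task's return value is passed as the first argument of the next task in the chain; .si() creates an immutable signature that ignores the previous result, which is why the question's chain of .si() calls never feeds add_demo's result into mul_demo.

# sketch.py -- illustration only; assumes a broker at amqp://localhost (placeholder)
import time

from celery import Celery, chain

app = Celery("sketch", broker="amqp://localhost")

@app.task
def add(x, y):
    time.sleep(3)
    return x + y

@app.task
def mul(x, y):
    return x * y

# .s(): add returns 12, which is passed to mul as its first argument -> mul(12, 8) == 96
chain(add.s(4, 8) | mul.s(8)).apply_async()

# .si(): immutable signatures, previous results are dropped, so mul is called as
# mul(8) and raises a missing-argument TypeError in the worker (the question's issue)
chain(add.si(4, 8) | mul.si(8)).apply_async()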
./tasks.py
import time

from celery import Celery, chain

app = Celery("tasks")
app.conf.update(
    beat_schedule={
        "every-30-seconds": {
            "task": "tasks.chain_demo",  # beat schedules a registered task by name
            "schedule": 30,
        },
    },
)

@app.task
def chain_demo():
    # Entrypoint as a celery task, so celery beat can schedule it
    tasks = (
        add_demo.s(10, 7)
        | mul_demo.s(10)
        | insert_db_demo.s()
    )
    chain(tasks).apply_async()

def chain_demo_non_task():
    # Same entrypoint as an ordinary python function (for manual calls)
    tasks = (
        add_demo.s(10, 7)
        | mul_demo.s(10)
        | insert_db_demo.s()
    )
    chain(tasks).apply_async()

@app.task
def add_demo(x, y):
    time.sleep(3)
    return x + y

@app.task
def mul_demo(x, y):
    time.sleep(3)
    return x * y

@app.task(ignore_result=True)
def insert_db_demo(result):
    print('insert db , result {}'.format(result))
Run the worker and the beat scheduler (in separate terminals):
celery --app=tasks worker --loglevel=INFO
celery --app=tasks beat --loglevel=INFO
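For quick local testing, you can also embed the beat scheduler in the worker process with the --beat (-B) flag instead of running two processes (not recommended for production):
celery --app=tasks worker --beat --loglevel=INFO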
Result (the task was invoked every 30 seconds):
# Run 1
[2021-05-06 15:10:37,590: INFO/MainProcess] Received task: tasks.chain_demo[c452cf5c-b7cb-4b2b-9284-ada0b19fdedc]
[2021-05-06 15:10:37,618: INFO/ForkPoolWorker-2] Task tasks.chain_demo[c452cf5c-b7cb-4b2b-9284-ada0b19fdedc] succeeded in 0.02621765899948514s: None
[2021-05-06 15:10:37,619: INFO/MainProcess] Received task: tasks.add_demo[bfaf0b5a-9cd9-43e6-96b1-9bbf11da1a82]
[2021-05-06 15:10:40,625: INFO/ForkPoolWorker-2] Task tasks.add_demo[bfaf0b5a-9cd9-43e6-96b1-9bbf11da1a82] succeeded in 3.0041544650002834s: 17
[2021-05-06 15:10:40,627: INFO/MainProcess] Received task: tasks.mul_demo[a104c1e3-175e-4fca-9911-88c30157e96f]
[2021-05-06 15:10:43,631: INFO/ForkPoolWorker-2] Task tasks.mul_demo[a104c1e3-175e-4fca-9911-88c30157e96f] succeeded in 3.0032322850001947s: 170
[2021-05-06 15:10:43,632: INFO/MainProcess] Received task: tasks.insert_db_demo[fce0d2c4-43cc-452e-b3a7-6f0c39cff9fb]
[2021-05-06 15:10:43,633: WARNING/ForkPoolWorker-2] insert db , result 170
[2021-05-06 15:10:43,633: INFO/ForkPoolWorker-2] Task tasks.insert_db_demo[fce0d2c4-43cc-452e-b3a7-6f0c39cff9fb] succeeded in 0.00018388900025456678s: None
# Run 2
[2021-05-06 15:11:07,567: INFO/MainProcess] Received task: tasks.chain_demo[f3851f64-24f0-454c-a1db-fe2c6f1c59ce]
[2021-05-06 15:11:07,569: INFO/ForkPoolWorker-2] Task tasks.chain_demo[f3851f64-24f0-454c-a1db-fe2c6f1c59ce] succeeded in 0.0011563549996935762s: None
[2021-05-06 15:11:07,571: INFO/MainProcess] Received task: tasks.add_demo[0ddf340e-78d5-4b13-9190-6815b1b03106]
[2021-05-06 15:11:10,575: INFO/ForkPoolWorker-2] Task tasks.add_demo[0ddf340e-78d5-4b13-9190-6815b1b03106] succeeded in 3.0036702569996123s: 17
[2021-05-06 15:11:10,576: INFO/MainProcess] Received task: tasks.mul_demo[e68b81a4-723f-40df-97a0-b3ab63f29e03]
[2021-05-06 15:11:13,584: INFO/ForkPoolWorker-2] Task tasks.mul_demo[e68b81a4-723f-40df-97a0-b3ab63f29e03] succeeded in 3.007046304999676s: 170
[2021-05-06 15:11:13,588: INFO/MainProcess] Received task: tasks.insert_db_demo[f7d0d9e1-7c2e-48af-a4a8-3d6625d351d3]
[2021-05-06 15:11:13,589: WARNING/ForkPoolWorker-2] insert db , result 170
[2021-05-06 15:11:13,589: INFO/ForkPoolWorker-2] Task tasks.insert_db_demo[f7d0d9e1-7c2e-48af-a4a8-3d6625d351d3] succeeded in 0.0001733720000629546s: None
>>> import tasks
>>> tasks.chain_demo.apply_async() # Using the celery task asynchronously
<AsyncResult: 46fe65b2-9575-477e-bcdf-11f223db765b>
>>> tasks.chain_demo() # Using the celery task directly
>>> tasks.chain_demo_non_task() # Using the ordinary function
>>>
Result:
# Using the celery task asynchronously
[2021-05-06 15:33:51,209: INFO/MainProcess] Received task: tasks.chain_demo[46fe65b2-9575-477e-bcdf-11f223db765b]
[2021-05-06 15:33:51,211: INFO/ForkPoolWorker-2] Task tasks.chain_demo[46fe65b2-9575-477e-bcdf-11f223db765b] succeeded in 0.0017580129997440963s: None
[2021-05-06 15:33:51,214: INFO/MainProcess] Received task: tasks.add_demo[9b4e6577-856b-457b-9310-44d7dda8979b]
[2021-05-06 15:33:54,221: INFO/ForkPoolWorker-2] Task tasks.add_demo[9b4e6577-856b-457b-9310-44d7dda8979b] succeeded in 3.0058788619999177s: 17
[2021-05-06 15:33:54,225: INFO/MainProcess] Received task: tasks.mul_demo[e5f9b57b-290d-4a53-92d2-3a7c69c8890c]
[2021-05-06 15:33:57,232: INFO/ForkPoolWorker-2] Task tasks.mul_demo[e5f9b57b-290d-4a53-92d2-3a7c69c8890c] succeeded in 3.00442869500057s: 170
[2021-05-06 15:33:57,236: INFO/MainProcess] Received task: tasks.insert_db_demo[28ab6ab3-2401-45be-ab67-bff9febbe856]
[2021-05-06 15:33:57,239: WARNING/ForkPoolWorker-2] insert db , result 170
[2021-05-06 15:33:57,239: INFO/ForkPoolWorker-2] Task tasks.insert_db_demo[28ab6ab3-2401-45be-ab67-bff9febbe856] succeeded in 0.00012745499952870887s: None
# Using the celery task directly
[2021-05-06 15:34:24,018: INFO/MainProcess] Received task: tasks.add_demo[7395866f-54fe-4d02-8114-ae6ac044e25a]
[2021-05-06 15:34:27,026: INFO/ForkPoolWorker-2] Task tasks.add_demo[7395866f-54fe-4d02-8114-ae6ac044e25a] succeeded in 3.0071180750001076s: 17
[2021-05-06 15:34:27,030: INFO/MainProcess] Received task: tasks.mul_demo[2a2b1452-51d4-49bd-af06-9248895ecc9f]
[2021-05-06 15:34:30,039: INFO/ForkPoolWorker-2] Task tasks.mul_demo[2a2b1452-51d4-49bd-af06-9248895ecc9f] succeeded in 3.005947522000497s: 170
[2021-05-06 15:34:30,041: INFO/MainProcess] Received task: tasks.insert_db_demo[7095f69f-a02d-4eb7-815e-22e69e2eb4dd]
[2021-05-06 15:34:30,041: WARNING/ForkPoolWorker-2] insert db , result 170
[2021-05-06 15:34:30,042: INFO/ForkPoolWorker-2] Task tasks.insert_db_demo[7095f69f-a02d-4eb7-815e-22e69e2eb4dd] succeeded in 0.00013701599982596235s: None
# Using the ordinary function
[2021-05-06 15:34:38,889: INFO/MainProcess] Received task: tasks.add_demo[c0f30ae6-0c92-4991-859a-b44f8ef8f88d]
[2021-05-06 15:34:41,897: INFO/ForkPoolWorker-2] Task tasks.add_demo[c0f30ae6-0c92-4991-859a-b44f8ef8f88d] succeeded in 3.0069600609995177s: 17
[2021-05-06 15:34:41,899: INFO/MainProcess] Received task: tasks.mul_demo[ec4b1a9f-b8c8-440f-af37-d1a905ce9747]
[2021-05-06 15:34:44,905: INFO/ForkPoolWorker-2] Task tasks.mul_demo[ec4b1a9f-b8c8-440f-af37-d1a905ce9747] succeeded in 3.0045316010000533s: 170
[2021-05-06 15:34:44,907: INFO/MainProcess] Received task: tasks.insert_db_demo[7d36777b-e8e1-4983-894f-0539f476874d]
[2021-05-06 15:34:44,908: WARNING/ForkPoolWorker-2] insert db , result 170
[2021-05-06 15:34:44,908: INFO/ForkPoolWorker-2] Task tasks.insert_db_demo[7d36777b-e8e1-4983-894f-0539f476874d] succeeded in 0.00018011399970419006s: None
Upvotes: 2