Reputation: 111
I tried this:
main.py
import tasks
if __name__ == '__main__':
    result = tasks.add.apply_async([4, 4], queue='broadcast_tasks')
    result.ready()
    value = result.get()
    print(value)
tasks.py
from celery import Celery
from kombu.common import Broadcast
app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1',
             include=['tasks'])
app.conf.update(
    result_expires=3600,
    task_serializer='json',
    result_serializer='json',
    accept_content=['json'],
    timezone='UTC',
)
app.conf.task_queues = (Broadcast('broadcast_tasks'),)
app.conf.task_routes = {'tasks.add': {'queue': 'broadcast_tasks'}}
@app.task
def add(x, y):
    return x + y
I'd like to broadcast a task to all workers using Celery with Redis as both broker and backend, but I haven't been able to get it working. Can you help me, please?
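For reference, each worker would also have to consume the broadcast queue; a sketch of the worker invocation (assuming the app module is named tasks, as above):
celery -A tasks worker -Q broadcast_tasks --loglevel=info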
Upvotes: 1
Views: 1869
Reputation: 3477
The Celery team is in the process of merging a pull request for this: https://github.com/celery/celery/pull/3934
This should be available in the next release (I hope).
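Until that lands, broadcast queues do work with an AMQP broker such as RabbitMQ. Here is a minimal sketch of the same setup against a local RabbitMQ instance; the broker/backend URLs are assumptions, not taken from the question:
from celery import Celery
from kombu.common import Broadcast

# Same Broadcast configuration, but on an AMQP broker, where fanout
# (broadcast) exchanges are natively supported; Redis stays as the result backend.
app = Celery('tasks',
             broker='amqp://guest:guest@localhost:5672//',
             backend='redis://localhost:6379/1')

app.conf.task_queues = (Broadcast('broadcast_tasks'),)
app.conf.task_routes = {'tasks.add': {'queue': 'broadcast_tasks'}}

@app.task
def add(x, y):
    # Each worker consuming broadcast_tasks receives its own copy of the task.
    return x + y
Every worker started with -Q broadcast_tasks then picks up each published add task.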
Upvotes: 1