Reputation: 37
I have Celery configured and working with Django. On the post_save signal, I add a new record to a set using a task, and with another periodic task I try to consume that set.
from __future__ import absolute_import, unicode_literals

from celery import shared_task


class Data:
    def __init__(self):
        self.slotshandler = set()


data = Data()


@shared_task
def ProcessMailSending():  # This is a periodic task, running every 30 seconds
    global data  # This variable is always empty here
    while data.slotshandler:
        slot_instance = data.slotshandler.pop()
        print("sending mail for slot {} to {} by mail {}".format(
            slot_instance.id, slot_instance.user, slot_instance.user_mail))


@shared_task
def UpdateSlotHandler(new_slot):  # This is called by the post_save Django signal
    global data
    data.slotshandler.add(new_slot)  # filling the set at each new record
The problem is that the periodic task never sees my newly added time slots. Note that this Django app runs as a microservice for sending reminder mails to users.
Upvotes: 0
Views: 367
Reputation: 20702
Different Celery tasks run in different worker processes, which don't share memory; that is, your global isn't persistent between these processes. When your first task finishes, all memory associated with its process is released. Your second task creates an entirely new set of objects in memory, including your global variable.
You really need to save the data somewhere more persistent: either a database or an in-memory cache (Memcached, for example).
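For illustration, here is a minimal sketch of that approach using Redis rather than Memcached, since Redis has a native set type that maps directly onto your slotshandler set. It assumes a Redis server on localhost, the redis-py package, and a made-up key name pending_slot_ids; also note that you would pass the slot's primary key to the task, not the model instance, since task arguments and Redis values must be serializable.

from __future__ import absolute_import, unicode_literals

import redis
from celery import shared_task

# One connection per worker process; each process can safely open its own.
# Assumption: Redis is running on localhost:6379.
r = redis.Redis(host="localhost", port=6379, db=0)


@shared_task
def UpdateSlotHandler(new_slot_id):  # called from the post_save signal
    # SADD is atomic, so concurrent producers can't corrupt the set.
    # "pending_slot_ids" is a hypothetical key name for this sketch.
    r.sadd("pending_slot_ids", new_slot_id)


@shared_task
def ProcessMailSending():  # periodic task, running every 30 seconds
    # SPOP atomically removes and returns one member (or None when the set
    # is empty), so overlapping periodic runs never process the same slot.
    slot_id = r.spop("pending_slot_ids")
    while slot_id is not None:
        print("sending mail for slot {}".format(slot_id.decode()))
        slot_id = r.spop("pending_slot_ids")

Inside ProcessMailSending you would then look the record up by its primary key before building the mail, which also guarantees you send against fresh data.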
Upvotes: 1