Reputation: 4434
I want to share small pieces of information between my worker nodes in Celery (for example cached authorization tokens, statistics, ...).
If I create a global variable inside my tasks file, it is unique per worker (my workers are processes and have a lifetime of one task/execution).
What is the best practice? Should I save the state externally (in a DB), or create old-fashioned shared memory (which could be difficult because of the different pool implementations in Celery)?
Thanks in advance!
Upvotes: 9
Views: 9495
Reputation: 4434
I finally found a decent solution - the multiprocessing.Manager from the Python standard library:
from multiprocessing import Manager

# The Manager starts a server process that holds the shared objects;
# the proxies it returns can be used from other processes.
manag = Manager()
serviceLock = manag.Lock()
serviceStatusDict = manag.dict()
This dict can be accessed from every process and it is synchronized, but you still have to hold the lock when accessing it concurrently (just as with any other shared-memory implementation).
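As a minimal sketch of the pattern above: the Manager-backed dict and lock are created once, and each access that mutates the dict is wrapped in the shared lock. The helper name `set_token` and the key/value used here are illustrative, not part of any Celery API.

```python
from multiprocessing import Manager

manager = Manager()
service_lock = manager.Lock()
service_status = manager.dict()

# Hypothetical helper: hold the shared lock while mutating the dict so
# writes from different worker processes cannot interleave.
def set_token(name, token):
    with service_lock:
        service_status[name] = token

set_token("auth", "abc123")
with service_lock:
    token = service_status["auth"]  # -> "abc123"
```

In a real Celery setup the `manager`, `service_lock`, and `service_status` proxies would have to be created in the parent process before the worker pool forks, so that every worker inherits the same proxies.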
Upvotes: 10