Reputation: 2791
I want to collect data in each request (a single request can cause several changes), and process the data at the end of the request.
So I'm using a singleton class to collect the data, and I'm processing the data in it in a request_finished
signal handler. Should this work, or should I expect data loss or other issues?
singleton class:
    class Singleton(type):
        _instances = {}

        def __call__(cls, *args, **kwargs):
            if cls not in cls._instances:
                cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
            return cls._instances[cls]

    class DataManager(object, metaclass=Singleton):
        ....
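For reference, the metaclass above does return the same instance on every call within a single process, which is exactly why it misbehaves across multiple processes. A runnable sketch (the `events` attribute and the demo usage are illustrative assumptions, not part of the question's code):

```python
class Singleton(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        # Create the instance only on the first call; reuse it afterwards.
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class DataManager(metaclass=Singleton):
    def __init__(self):
        # Assumed collector field; the question elides the class body.
        self.events = []

    def add_event_to_list(self, event):
        self.events.append(event)

a = DataManager()
b = DataManager()
a.add_event_to_list("item_created")
print(a is b)    # True: one shared instance per process
print(b.events)  # ['item_created']
```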
Using it in another signal handler:

    item_created = DataManager().ItemCreated(item)
    DataManager().add_event_to_list(item_created)
The request_finished signal handler:

    from django.core.signals import request_finished
    from django.dispatch import receiver

    @receiver(request_finished, dispatch_uid="request_finished")
    def my_request_finished_handler(sender, **kwargs):
        DataManager().process_data()
Upvotes: 0
Views: 354
Reputation: 161
On further thought, it's not a good idea to use the request in signal handlers, neither via a global nor by passing it down the call chain.
Django has two main paths: Views and Commands. Views are used for web requests, which is generating a 'request' object. Commands are used through the console and do not generate a 'request' object. Ideally, your models (thus, also signals) should be able to support both paths (for example: for data migrations during a project's lifespan). So it's inherently incorrect to tie down your signals to the request object.
It would be better to use something like thread-local storage, and to make sure your thread-global class does not rely on anything from the request. See, for example: Is there a way to access the context from everywhere in Django?
Upvotes: 1
Reputation: 77912
A singleton means you have one single instance per process. A typical production Django setup has one or more front servers running multiple long-lived Django processes, each of them serving any incoming request. FWIW you could even serve Django from concurrent threads in the same process AFAICT. In this context, the same user can have subsequent requests served by different processes/threads, and any long-lived 'global' object will be shared by all requests served by the current process. The net result is, as Daniel Roseman rightly comments, that "Singletons will never do what you want in a multi-process multi-user environment like Django".
If you want to collect per-request-response-cycle data, your best bet is to store it on the request object itself, using a middleware to initialize the collector when the request processing cycle starts and do something with the collected data when it ends. This of course requires passing the request all along... which is indeed kind of clumsy.
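A minimal sketch of that middleware, in Django's current callable-class middleware style (the `data_collector` attribute name and the `process_data` step are illustrative assumptions; the demo uses a stand-in request object so it runs without Django):

```python
class DataCollectorMiddleware:
    """Attach a fresh per-request collector, then process it after the view."""

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        request.data_collector = []                 # cycle start: fresh collector
        response = self.get_response(request)
        self.process_data(request.data_collector)   # cycle end: handle the data
        return response

    def process_data(self, events):
        # Placeholder: do something with the collected events.
        print(events)

# Demonstration with a stand-in request and view (no Django needed here):
class FakeRequest:
    pass

def view(request):
    request.data_collector.append("item_created")
    return "response"

middleware = DataCollectorMiddleware(view)
middleware(FakeRequest())  # prints ['item_created']
```

Views, and anything they call that receives the request, append to `request.data_collector`, so the collected data never outlives its request.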
A workaround here could be to connect some of your per-request "collector" object's methods as signal handlers, taking care of properly setting the "dispatch_uid" so you can disconnect them before sending the response, and preferably using weak references to avoid memory leaks.
NB: If you want to collect per-user information, that's what the session framework is for, but I assume you already understood this.
Upvotes: 2