Reputation:
I have a few Django microservices. Their main workload is constant background processes, not request handling.
These background processes use Django's ORM constantly, and since I had to hack a few things to make that work (and it did, for quite a while), I now have problems with the DB connection. Django isn't really built for heavy DB usage from background processes, I guess...
Celery is always suggested in these cases, but before switching the entire design, I want to know if it really is a good solution.
Can celery tasks (a lot of tasks, time-consuming tasks) use Django's ORM in the background without problems?
Upvotes: 0
Views: 2597
Reputation: 52
You can set up the Django environment in a standalone script with the following steps:
import os
# Point Django at your project's settings module before importing django
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "cmdb.settings")
import django
# Initialize the app registry and settings so the ORM can be used
django.setup()
Upvotes: -1
Reputation: 32347
Can celery tasks (a lot of tasks, time-consuming tasks) use Django's ORM in the background without problems?
Yes, depending on your definition of “problems” :-)
More seriously: Django ORM performance will mostly be limited by the performance characteristics of the underlying database engine.
If your chosen database engine is PostgreSQL, for example, you will be able to handle a high volume of concurrent connections.
Upvotes: 4
Reputation: 599796
Celery was originally written specifically as an offline task processor for Django, and although it was later generalised to work with any Python code, it still works perfectly well with Django.
How many tasks there are and how long they take are largely irrelevant to the choice of technology; each Celery worker runs as a separate process, so the limiting resource will be your server capacity.
Upvotes: 0