Reputation: 9859
I have a very simple Django app:
models.py:
from django.db import models

class Product(models.Model):
    name = models.CharField(max_length=1000, default="")
    desc = models.TextField(default="")
views.py:
from django.http import HttpResponse
from models import Product

def fetch(request):
    for p in Product.objects.all()[:300000]:
        pass
    return HttpResponse("done")
I've loaded 300k sample records into a MySQL database, turned debug off in settings.py, and tried executing the fetch view. After it completes, Django still sits on 700 MB of RAM.
I understand that it needs memory to fetch all these 300k objects, but why on earth does it keep them after the view function exits?
Again, this is with DEBUG=False. I tried this with the Django dev server and also with uWSGI, and it's the same weird behavior.
P.S. Verified with Django 1.4 and 1.5.4 on both Python 2.6/2.7, Linux 64-bit.
Upvotes: 0
Views: 377
Reputation: 599600
This is not really anything to do with Django. Generally, Python does not return freed memory to the operating system; it keeps it around for reuse within the process.
See the effbot's explanation for more detail.
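As a hedged illustration, here is a standalone sketch (no Django involved; Linux-only, since it reads /proc, which matches the asker's setup) showing the same effect: the process's resident set size often stays near its peak even after a large list is deleted.

def rss_kb():
    # Current resident set size in kB, read from /proc/self/status (Linux-only).
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])

print("before: %s kB" % rss_kb())
data = [object() for _ in range(300000)]  # allocate ~300k small objects
print("after alloc: %s kB" % rss_kb())
del data
# CPython (2.x especially) tends to keep this memory in its allocator's
# arenas and free lists for reuse rather than handing it back to the OS,
# so RSS typically remains high here.
print("after del: %s kB" % rss_kb())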
Upvotes: 3
Reputation: 47172
This line is the culprit:

for p in Product.objects.all()[:300000]:

Slicing doesn't evaluate the QuerySet by itself; it returns another lazy QuerySet limited to those rows. Iterating over it is what hits the database, and as the objects come back, the QuerySet stores every one of them in its internal result cache, since you might want to iterate over the same QuerySet again without a second query.

You could optimize this by using iterator(). It's in the docs here.
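A minimal sketch of the view rewritten that way (assuming the same Product model and view from the question):

from django.http import HttpResponse
from models import Product

def fetch(request):
    # iterator() reads results directly from the database cursor and does
    # not populate the QuerySet's result cache, so each Product instance
    # can be garbage-collected once the loop moves past it.
    for p in Product.objects.all()[:300000].iterator():
        pass
    return HttpResponse("done")

Note that with the MySQLdb driver the raw rows may still be buffered client-side while the query runs, but the model instances themselves are no longer retained after the loop.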
Upvotes: 0