Reputation: 167
I am using Django 1.3 and I am running a script outside of a web context, under supervisor.
The memory usage of the process grows every minute.
The code looks more or less like this:
from time import sleep
from django import db

while True:
    for auction in auction_list:  # auction_list is fetched earlier, e.g. Auction.objects.all()
        auction.update_auction()
    db.reset_queries()     # clear Django's per-connection query log
    db.close_connection()  # added to avoid table locks, see below
    sleep(1)
Adding close_connection() helped me out by avoiding locks on the table, but now I have this growing-memory problem.
How could I manage things to avoid this?
Upvotes: 0
Views: 415
Reputation: 167
I found a solution. close_connection() was responsible for the growing memory; it seems to come from repeatedly connecting to and disconnecting from the database.
I proceeded this way:
from time import sleep
from django import db

while True:
    # Get the auction list
    auction_list = Auction.objects.all()
    # Check the auctions
    for auction in auction_list:
        auction.update_auction()
    # Clear Django's per-connection query log
    db.reset_queries()
    # Release the table locks without closing the connection
    db.transaction.commit_unless_managed()
    # Pause
    sleep(1)
With commit_unless_managed(), the daemon keeps its connection open, no longer grows in memory, and no longer locks the MyISAM table.
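As a side note, the reset_queries() call matters here because Django appends every executed query to connection.queries while settings.DEBUG is True, so a long-running loop can leak memory through that log alone. A minimal sketch of the effect (purely illustrative):

from django import db
from django.conf import settings

# With DEBUG=True, Django stores every executed query on the
# connection object; in a daemon this log grows forever unless
# it is cleared explicitly.
if settings.DEBUG:
    print(len(db.connection.queries))  # grows with each iteration
    db.reset_queries()                 # empties the query log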
Upvotes: 3
Reputation: 32522
Let the process run to completion, then let it die; the operating system will reclaim the Python process's resources.
Or, you could consider using something like Celery, which is built for this.
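A hedged sketch of the Celery route, using the periodic_task decorator from the Celery 2.x / django-celery era that matched Django 1.3 (the task name, the myapp module, and the one-second interval are illustrative assumptions, not from the question):

from datetime import timedelta

from celery.task import periodic_task

from myapp.models import Auction  # hypothetical app module

# Celery runs this task on a schedule, so no hand-rolled
# while-loop or sleep() is needed; the worker manages its
# own database connections between runs.
@periodic_task(run_every=timedelta(seconds=1))
def update_auctions():
    for auction in Auction.objects.all():
        auction.update_auction()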
Upvotes: 1
Reputation: 239200
Umm... because you have the code wrapped in a while(1) block? Of course it's growing out of control. You've created an infinite loop that continuously queries and updates the database until the end of eternity.
Upvotes: 1