Reputation: 9823
I have a function that downloads and saves HTML pages. Over time, the memory is not released and the program becomes slow. How can I force this memory to be released (or find out what is taking it up)?
I think the problem may be with reading and writing the file. Although I call close(), could there be another issue?
The following code is inside a for loop (this is done 1000+ times):
openFile = None
try:
    # download the page
    pageText = getPageAsText(url)
    # write the file to disk
    fileName = name
    openFile = open(os.path.expanduser('~') + STATIC_DIRECTORY + '/' + name, 'w')
    openFile.write(pageText)
except Exception:
    traceback.print_exc()
finally:
    if openFile is not None:
        openFile.close()
Upvotes: 0
Views: 180
Reputation: 2921
You can use del to dereference variables that are no longer needed. This gives the garbage collector a hint to reclaim the memory earlier. Without your entire code, we cannot see where the memory leak occurs.
I would also rewrite your code with a with statement, so the file is closed automatically:
pageText = getPageAsText(url)
with open(os.path.expanduser('~') + STATIC_DIRECTORY + '/' + name, 'w') as f:
    f.write(pageText)
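For the del part, a minimal sketch of the whole loop body (getPageAsText, STATIC_DIRECTORY, and name are taken from your question; pages is a placeholder for whatever you iterate over, and where exactly you put del is up to you):

import os
import traceback

for url, name in pages:  # placeholder for your 1000+ (url, name) pairs
    try:
        pageText = getPageAsText(url)
        with open(os.path.expanduser('~') + STATIC_DIRECTORY + '/' + name, 'w') as f:
            f.write(pageText)
        # drop the only reference to the page text so it can be freed right away
        del pageText
    except Exception:
        traceback.print_exc()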
Upvotes: 2
Reputation: 6186
You can call the garbage collector manually with gc.collect(). But if your data is still referenced, it will not be released, so calling it alone will not help.
Add this code at the end of the loop:
pageText = None
fileName = None
openFile = None
Python's gc will then reclaim the memory automatically.
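Note that rebinding these names to None only frees the memory if they held the last references to those objects; if something else (for example, a cache inside getPageAsText) still points at the page text, it will stay alive.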
Upvotes: 0
Reputation: 15511
I have a feeling the problem may be elsewhere, but to force garbage collection:
import gc
gc.collect() # force garbage collection
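Collecting on every pass of a 1000+ iteration loop can be slow, so one option is to collect periodically. A sketch, where save_page is a hypothetical helper wrapping your download-and-write code, urls stands in for your page list, and the interval of 100 is an arbitrary choice:

import gc

for i, url in enumerate(urls):  # urls: placeholder for your 1000+ page URLs
    save_page(url)  # hypothetical helper: download the page and write it to disk
    if i % 100 == 0:
        gc.collect()  # force a full collection every 100 pages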
Upvotes: 0