Reputation: 309
I have a large list containing large objects from a single class:
my_list = [LargeClass() for i in xrange(10000)]
I need to copy a slice of the list into an auxiliary list, but then, to conserve memory, I would like to replace that slice in the original list with a bunch of Nones:
new_list = my_list[:1000]
my_list[:1000] = [None] * 1000
My hope was that this would reduce the memory used by 'my_list' so I don't carry around two copies of the same "data". However, this doesn't release any memory. Calling the garbage collector doesn't make any difference either.
Is there any way to accomplish this?
Edit: I should have mentioned that the second list will be passed as an argument to a child process (multiprocessing), so it will be copied. When that's done, I don't need the data in the original list, which is just wasting memory now.
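To make the scenario concrete, here is roughly the shape of my code (LargeClass and worker here are just stand-ins for the real thing):
import multiprocessing

class LargeClass(object):
    """Stand-in for the real class; holds some data so each instance is heavy."""
    def __init__(self):
        self.data = bytearray(10 ** 4)   # ~10 KB per instance, just for illustration

def worker(chunk):
    # Stand-in for the real child-process work; 'chunk' arrives as a
    # pickled copy, so the child gets its own copy of the objects.
    return len(chunk)

if __name__ == '__main__':
    my_list = [LargeClass() for i in xrange(10000)]

    new_list = my_list[:1000]           # auxiliary list to hand to the child
    my_list[:1000] = [None] * 1000      # blank out that slice in the original

    pool = multiprocessing.Pool(processes=1)
    result = pool.apply(worker, (new_list,))   # new_list is copied into the child
    pool.close()
    pool.join()
    print(result)   # 1000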
Upvotes: 0
Views: 785
Reputation: 26717
As Roger said, you're not actually copying the objects; you're duplicating references to them:
In [27]: mylist = [object() for i in range(10000)]
In [28]: newlist = mylist[:1000]
In [29]: mylist[0] is newlist[0]
Out[29]: True
is checks that the two names refer to the same object (not merely equal, but identical).
If you want to destroy objects, you need to remove all references to them, which (though it's not what you want to do here) could be accomplished by simply writing:
mylist = mylist[:1000]
All told though, I've never had a demonstrated need to monkey with Python's GC. Either the objects are small enough that I don't care, or, if they're massive (hundreds of MB), they seem to be cleared in short order. My "problems" with garbage collection are usually problems in my own code, where I have a sneaky reference somewhere that I didn't clean up.
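For what it's worth, here's a small sketch of the reference-counting point above (Thing is just a stand-in for a large object): a weak reference lets you watch when an object is actually destroyed, which only happens once both lists have let go of it.
import gc
import weakref

class Thing(object):
    pass

mylist = [Thing() for i in range(10)]
newlist = mylist[:5]

probe = weakref.ref(mylist[0])   # weak reference: does not keep the object alive

mylist[:5] = [None] * 5          # drop the slots in the original list...
print(probe() is None)           # False -- newlist still references the object

del newlist                      # ...and now the last strong reference
gc.collect()                     # usually unnecessary for non-cyclic objects in CPython
print(probe() is None)           # True -- the object has been destroyed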
Upvotes: 2
Reputation: 107287
As the comment says, you cannot free the memory just by rebinding names (references). A better approach is to remove the indexes you don't want:
my_list[:1000] = []
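For the scenario in the question, that would look roughly like this (del on a slice has the same effect as assigning an empty list to it). Keep in mind this shortens the list and shifts the remaining indices, unlike padding the slice with None:
new_list = my_list[:1000]   # copy of the references to hand to the child process
del my_list[:1000]          # same effect as my_list[:1000] = []
# my_list is now 1000 elements shorter; the objects themselves are only
# freed once new_list (and any other references to them) are gone as well.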
Upvotes: 0