Reputation: 11917
I was just checking the sizes of some data types in Python 3 and I observed this:
import sys
val = None
print(sys.getsizeof(val))
The output was 16, as expected.
I tried making a list of 1000 elements, all None, and I expected the size to be 16 * 1000 = 16000 or more. But the result I got was different.
import sys
val = [None]*1000
print(sys.getsizeof(val))
The output was 8064, nearly half the size I expected. What is the reason for this? Why is less memory allocated?
Upvotes: 2
Views: 1908
Reputation: 696
import sys
val = None
print(sys.getsizeof(val))
Answer:
16
val = []
print(sys.getsizeof(val))
Answer:
72
val = [None]
print(sys.getsizeof(val))
Answer:
80
so the size of [None] * 1000 is roughly 1000 * 8 + 72 = 8072 bytes: one 8-byte reference per element plus the 72-byte overhead of the list object itself.
Note: the exact number of bytes may vary depending on the environment.
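A small sketch (assuming a 64-bit CPython build; the header size and exact numbers differ between environments) that checks this pattern for a few lengths:
import sys

# Compare the measured list size with "empty-list header + 8 bytes per reference".
# The header size (sys.getsizeof([])) depends on the Python build.
for n in (0, 1, 10, 1000):
    lst = [None] * n
    print(n, sys.getsizeof(lst), sys.getsizeof([]) + 8 * n)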
Upvotes: 6
Reputation: 33719
There is just a single None object referenced a thousand times. So the situation is this:
l[0] ----\
l[1] -----+--> None
...       |
l[999] ---/
And not this:
l[0] ----> None
l[1] ----> None
...
l[999] ----> None
This is more visible when repeating a mutable object, like this:
>>> l = [set()] * 3
>>> print(l)
[set(), set(), set()]
>>> l[0].add(1)
>>> print(l)
[{1}, {1}, {1}]
There is just a single, shared set object referenced three times, so changes to the set at l[0] also affect l[1] and l[2].
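A quick way to confirm the sharing is to compare identities (a small interactive sketch; output values are deterministic, unlike raw id() numbers):
>>> l = [None] * 3
>>> l[0] is l[1] is l[2] is None   # every slot holds the very same object
True
>>> len({id(x) for x in l})        # only one distinct identity among the elements
1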
Python data structures such as list, set and dict are reference-based. In your case, most of the 8064 bytes you observed come from the object references (8 bytes per list element).
Upvotes: 5