Reputation: 28739
I need a very large list, and am trying to figure out how big I can make it so that it still fits in 1-2GB of RAM. I am using the CPython implementation, on 64 bit (x86_64).
Edit: thanks to bua's answer, I have filled in some of the more concrete answers.
What is the space (memory) usage of (in bytes):

- the list itself: sys.getsizeof([]) == 72
- each list entry: sys.getsizeof([0, 1, 2, 3]) == 104, so 8 bytes overhead per entry (sizeof(PyObject *), I guess)
- a small int: sys.getsizeof(2**62) == 24 (but varies according to integer size):
  sys.getsizeof(2**63) == 40
  sys.getsizeof(2**128) == 48
  sys.getsizeof(2**256) == 66
- an instance: sys.getsizeof(C()) == 72 (C is an empty user-space class)

If you can share more general data about the observed sizes, that would be great.
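One way to sanity-check numbers like these yourself is a sketch along the following lines (`list_footprint` is a hypothetical helper; it ignores CPython's small-int cache and shared references, so treat it as a rough upper bound):

```python
import sys

def list_footprint(lst):
    # Rough total: the list object itself plus each element's own size.
    # Ignores sharing (CPython caches small ints) and does not recurse
    # into nested containers -- an estimate, not an exact figure.
    return sys.getsizeof(lst) + sum(sys.getsizeof(x) for x in lst)

nums = list(range(1000))
per_entry = (sys.getsizeof(nums) - sys.getsizeof([])) / float(len(nums))
print("pointer overhead per entry: ~%d bytes" % per_entry)
print("total footprint of 1000 ints: ~%d bytes" % list_footprint(nums))
```

At roughly 8 bytes of list overhead plus a few dozen bytes per distinct small int, a 1 GB budget works out to something on the order of tens of millions of distinct entries.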
Upvotes: 9
Views: 5382
Reputation: 94475
If you want lists of numerical values, the standard array module provides optimized arrays (that have an append method).
The non-standard, but commonly used NumPy module gives you fixed-size efficient arrays.
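For instance, a sketch with the array module (the 'q' typecode assumes a platform with 64-bit long long support):

```python
import sys
from array import array

# array stores machine-typed values inline, so each entry costs
# itemsize bytes instead of an 8-byte pointer plus a full int object.
a = array('q')            # 'q' = signed 64-bit integer
for i in range(1000):
    a.append(i)           # arrays support append, like lists

print(a.itemsize)         # 8
print(sys.getsizeof(a))   # header + ~8 bytes per element
```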
Upvotes: 7
Reputation: 4870
A starting point:
>>> import sys
>>> a=list()
>>> type(a)
<type 'list'>
>>> sys.getsizeof(a)
36
>>> b=1
>>> type(b)
<type 'int'>
>>> sys.getsizeof(b)
12
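Those numbers come from a 32-bit Python 2 build; on the 64-bit CPython the question asks about, the same probes report larger sizes (a sketch -- the exact values vary by version):

```python
import sys

a = list()
print(sys.getsizeof(a))   # e.g. 56 on 64-bit CPython 3.11 (vs. 36 above)
b = 1
print(sys.getsizeof(b))   # e.g. 28 on 64-bit CPython 3.11 (vs. 12 above)
```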
and from the Python help:
>>> help(sys.getsizeof)
Help on built-in function getsizeof in module sys:
getsizeof(...)
getsizeof(object, default) -> int
Return the size of object in bytes.
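The optional default argument is a fallback, returned when the object's type gives getsizeof no way to compute a size. A sketch (NoSize is a contrived class for illustration):

```python
import sys

class NoSize(object):
    def __sizeof__(self):
        # Simulate a type that cannot report its size; getsizeof turns
        # this TypeError into the fallback value when one is supplied.
        raise TypeError("size not available")

print(sys.getsizeof(NoSize(), -1))  # returns the fallback: -1
print(sys.getsizeof(2 ** 62))       # a word-sized int from the question
```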
Upvotes: 11