Reputation:
I have a struct, say:
type ASDF struct {
    A uint64
    B uint64
    C uint64
    D uint64
    E uint64
    F string
}
I create a slice of that struct: a := []ASDF{}
I do operations on that slice (adding, removing, and updating structs of varying contents). How can I get the total size in bytes (in memory) of the slice and its contents? Is there a built-in way to do this, or do I need to calculate it manually with unsafe.Sizeof plus len of each string?
Upvotes: 11
Views: 30025
Reputation: 363
// the internal structure of the string type
type _string struct {
    elements *byte // pointer to the underlying bytes
    len      int   // number of bytes
}

// more accurate
func (s *ASDF) size() int {
    size := int(unsafe.Sizeof(*s))
    // add the size of the string header (the _string struct above);
    // "" is used because it has zero length, so only the header size is counted
    size += len(s.F) + int(unsafe.Sizeof(""))
    return size
}
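To see why the string header matters, here is a small standalone sketch (my own, not part of the answer above): unsafe.Sizeof reports only the header, no matter how long the string is, so the backing bytes have to be added with len.

package main

import (
    "fmt"
    "unsafe"
)

func main() {
    empty := ""
    long := "a fairly long string whose bytes live outside the header"

    // Both lines print the same number: the size of the string header
    // (pointer + length), regardless of how long the string is.
    fmt.Println(unsafe.Sizeof(empty)) // 16 on a 64-bit platform
    fmt.Println(unsafe.Sizeof(long))

    // The backing bytes have to be counted separately.
    fmt.Println(unsafe.Sizeof(long) + uintptr(len(long)))
}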
Upvotes: 0
Reputation: 166569
Sum the sizes of all the memory used, excluding garbage collector and other overhead. For example,
package main

import (
    "fmt"
    "unsafe"
)

type ASDF struct {
    A uint64
    B uint64
    C uint64
    D uint64
    E uint64
    F string
}

// size returns the memory footprint of one ASDF value:
// the struct itself plus the bytes backing its string field.
func (s *ASDF) size() int {
    size := int(unsafe.Sizeof(*s))
    size += len(s.F)
    return size
}

// sizeASDF reports the total memory, in bytes, used by the slice s
// and its contents.
func sizeASDF(s []ASDF) int {
    size := 0
    s = s[:cap(s)] // walk the full capacity, not just the length
    size += cap(s) * int(unsafe.Sizeof(s))
    for i := range s {
        size += (&s[i]).size()
    }
    return size
}

func main() {
    a := []ASDF{}
    b := ASDF{}
    b.A = 1
    b.B = 2
    b.C = 3
    b.D = 4
    b.E = 5
    b.F = "ASrtertetetetetetetDF"
    fmt.Println((&b).size())
    a = append(a, b)
    c := ASDF{}
    c.A = 10
    c.B = 20
    c.C = 30
    c.D = 40
    c.E = 50
    c.F = "ASetDF"
    fmt.Println((&c).size())
    a = append(a, c)
    fmt.Println(len(a))
    fmt.Println(cap(a))
    fmt.Println(sizeASDF(a))
}
Output (on a 32-bit platform, where pointers and ints are 4 bytes):
69
54
2
2
147
http://play.golang.org/p/5z30vkyuNM
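The raw numbers depend on the platform's word size. A small sketch (my own addition, not part of the answer) that prints the two platform-dependent components so you can reproduce the arithmetic wherever you run it:

package main

import (
    "fmt"
    "unsafe"
)

type ASDF struct {
    A, B, C, D, E uint64
    F             string
}

func main() {
    // One struct value: five uint64 fields plus the string header
    // (pointer + length); the string's bytes are not included.
    fmt.Println(unsafe.Sizeof(ASDF{})) // 48 on 32-bit, 56 on 64-bit

    // The slice header itself (pointer + len + cap).
    fmt.Println(unsafe.Sizeof([]ASDF{})) // 12 on 32-bit, 24 on 64-bit
}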
Upvotes: 8
Reputation: 92976
I'm afraid unsafe.Sizeof is the way to go here if you want any result at all. The in-memory size of a structure is not something you should rely on; note that even the result of unsafe.Sizeof is inexact, since the runtime may add headers to the data, to aid garbage collection, that you cannot observe.
For your particular example (finding a cache size), I suggest you go with a static size that is sensible for many processors. In almost all cases, such micro-optimizations will not pay off.
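A minimal sketch of what that static-size approach could look like (the constant and names here are my own assumptions, not part of the answer):

package main

import "fmt"

// Assumed per-entry budget: ~56 bytes for the struct's fixed fields on
// a 64-bit platform plus a guessed average string length of 32 bytes.
// Both numbers are estimates; tune them for your own data.
const approxEntryBytes = 56 + 32

// approxCacheBytes gives a rough memory estimate for n cached entries.
func approxCacheBytes(n int) int {
    return n * approxEntryBytes
}

func main() {
    fmt.Println(approxCacheBytes(1000)) // rough budget for 1000 entries
}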
Upvotes: 3