Reputation: 192
I am considering using large numbers of gensyms to differentiate between objects in a system I'm building (like refs in erlang).
Should I expect to run into system limits after creating large numbers of gensyms?
For reference, I'm using SBCL.
Upvotes: 2
Views: 111
Reputation: 48745
Different implementations use different amounts of memory. From testing the number of bytes used by gensym,
it depends on the argument you pass and how unique that argument is compared to previous calls.
If you have a macro that always passes a fixed set of strings to gensym,
it will use roughly 0.5-1.5 kB per gensym. For consecutive calls with the same argument it drops to around 65-150 bytes each.
I had it make 65-byte gensyms for a while and stopped it well above 4 billion, but I don't know if that qualifies, since "large" is ambiguous.
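Since you're on SBCL, you can measure this yourself. Here is a rough sketch using the SBCL-specific `sb-ext:get-bytes-consed` (total bytes allocated so far); the helper name `bytes-per-gensym` is my own, and the result is only approximate since a GC cycle during the loop can skew it:

```lisp
;; Approximate average allocation per GENSYM call on SBCL.
;; SB-EXT:GET-BYTES-CONSED returns the total bytes consed by the process.
(defun bytes-per-gensym (n &optional (prefix "G"))
  (let ((before (sb-ext:get-bytes-consed)))
    (dotimes (i n)
      (gensym prefix))
    ;; Average bytes consed per call (includes loop overhead, GC noise).
    (round (- (sb-ext:get-bytes-consed) before) n)))

;; Try e.g. (bytes-per-gensym 1000000) at the REPL and compare
;; different prefixes to see the effect the answer describes.
```

Running it with a large `n` smooths out the per-call noise; expect different numbers for fresh prefixes versus a repeated one.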
Upvotes: 2