Reputation: 283
Say I have this method that I cache using lru_cache:
from functools import lru_cache

@lru_cache(maxsize=8)
def very_expensive_call(number):
    # do something that's very expensive
    return number
I am calling this method like this:
print([very_expensive_call(i) for i in range(10)]) # call_1
Because the maxsize of the cache is 8, only numbers 2-9 are cached at this point.
After call_1, I am doing call_2:
print([very_expensive_call(i) for i in range(10)]) # call_2
During call_2, again first number 0 is called (not in cache!), and after that numbers 0 and 3-9 are cached.
Then number 1 is called (not in cache!), and after that numbers 0-1 and 4-9 are cached.
Well, you see where this is going: the cache is never used...
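This is easy to confirm with cache_info(), which lru_cache exposes on the wrapped function; after the two passes the hit counter is still zero (a minimal, self-contained version of the code above):

from functools import lru_cache

@lru_cache(maxsize=8)
def very_expensive_call(number):
    # stand-in for the real expensive work
    return number

[very_expensive_call(i) for i in range(10)]  # call_1: 10 misses
[very_expensive_call(i) for i in range(10)]  # call_2: 10 misses again
print(very_expensive_call.cache_info())
# CacheInfo(hits=0, misses=20, maxsize=8, currsize=8)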
I understand that for this specific example I could alternate between range(...) and reversed(range(...)), but in a more complicated scenario that's probably not possible.
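For reference, the alternation idea can be checked the same way (continuing the snippet above): the reversed pass starts with the most recently cached values, so it mostly hits.

very_expensive_call.cache_clear()
[very_expensive_call(i) for i in range(10)]            # 10 misses
[very_expensive_call(i) for i in reversed(range(10))]  # 8 hits, then 2 misses
print(very_expensive_call.cache_info())
# CacheInfo(hits=8, misses=12, maxsize=8, currsize=8)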
Question: Is it possible to inspect which numbers are cached and to order the calls based on that? What would be the overhead for this?
Upvotes: 0
Views: 677
Reputation: 160417
No: the cache used by lru_cache is specifically designed not to be public-facing. All its internals are encapsulated, for thread safety and so that existing code does not break if the implementation changes.
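What the wrapper does expose publicly is aggregate statistics and maintenance hooks, not the stored keys; a minimal sketch:

from functools import lru_cache

@lru_cache(maxsize=8)
def f(n):
    return n

f(1); f(2); f(1)
print(f.cache_info())        # CacheInfo(hits=1, misses=2, maxsize=8, currsize=2)
f.cache_clear()              # empties the cache; there is no API to list the cached keys
print(f.cache_parameters())  # {'maxsize': 8, 'typed': False} (Python 3.9+)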
Apart from that, I don't think it is a good idea to order your input based on the caching; you should cache based on your input. If your callable is not periodically called with the same arguments, maybe a cache is not the best option.
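One reading of "cache based on your input" (my assumption, not something stated in the question): if the set of distinct arguments is known and small, size maxsize to cover it, or pass maxsize=None for an unbounded cache, instead of reordering the calls:

from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded; alternatively any maxsize >= number of distinct inputs
def very_expensive_call(number):
    return number

[very_expensive_call(i) for i in range(10)]  # 10 misses
[very_expensive_call(i) for i in range(10)]  # 10 hits
print(very_expensive_call.cache_info())
# CacheInfo(hits=10, misses=10, maxsize=None, currsize=10)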
Upvotes: 2