Reputation: 2792
I was watching one of Raymond Hettinger's awesome videos and I got a bit confused on the decorator example:
from functools import wraps

def cache(func):
    saved = {}
    @wraps(func)
    def newfunc(*args):
        if args in saved:
            return newfunc(*args) # should be return saved[args]?
        result = func(*args)
        saved[args] = result
        return result
    return newfunc
I'm not particularly an expert on decorators, but doesn't the return of the call to newfunc(*args) upon finding the item in the cache cause a recursive loop that never finishes? I think it is supposed to return saved[args]. (The function eventually returns result, which is the same thing, but I don't think it ever gets there if the item is found in the cache.)
Upvotes: 4
Views: 3638
Reputation: 51
You could also use @functools.lru_cache(maxsize=128, typed=False). This would look like:
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)
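Repeating that snippet so it runs on its own, with a cache_info() check appended (cache_info is provided by lru_cache and reports hits, misses, and cache size):

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def fib(n):
        if n < 2:
            return n
        return fib(n-1) + fib(n-2)

    print(fib(10))           # 55 (this version starts from fib(0) == 0, fib(1) == 1)
    print(fib.cache_info())  # CacheInfo(hits=8, misses=11, maxsize=None, currsize=11)

Only 11 misses for fib(0) through fib(10): every repeated subproblem is served from the cache.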
Upvotes: 5
Reputation: 366203
Yes, that's a mistake.
If you're not sure, let's test it:
def fib(n):
    if n < 2:
        return 1
    return fib(n-2) + fib(n-1)

print(fib(10))

@cache
def cfib(n):
    if n < 2:
        return 1
    return cfib(n-2) + cfib(n-1)

print(cfib(10))
The first one prints out 89, the second one aborts:
  File "rhcache.py", line 8, in newfunc
    return newfunc(*args) # should be return saved[args]?
  File "rhcache.py", line 8, in newfunc
    return newfunc(*args) # should be return saved[args]?
  # ... 997 more copies
RuntimeError: maximum recursion depth exceeded
But if we change it as you suggest, it prints 89 again. (And, if you time it, it runs faster than the non-cached version; if you profile it, it makes only 10 calls to the real function; etc.)
All exactly as you expected.
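For reference, here is the decorator with that one-line fix applied, as a self-contained script:

    from functools import wraps

    def cache(func):
        saved = {}
        @wraps(func)
        def newfunc(*args):
            if args in saved:
                return saved[args]  # the fix: return the cached result
            result = func(*args)
            saved[args] = result
            return result
        return newfunc

    @cache
    def cfib(n):
        if n < 2:
            return 1
        return cfib(n-2) + cfib(n-1)

    print(cfib(10))  # 89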
So, what have we learned? That even Raymond Hettinger is not above occasional typos in untested code, but his code is clean enough that it's easy to find and fix the problem even without running it. :)
You could send him an email, add a comment on the YouTube page, or report an issue on the PyVideo page.
Upvotes: 4