Reputation: 676
I'm looking for a way to create a decorator whose parameter actually uses a variable passed into the function it's wrapping.
For example, let's say I have

    @cache_decorator("my_key_{}".format(foo))
    def my_function(foo, bar):
        pass

    @cache_decorator("another_key_{}_{}".format(foo, bar))
    def another_function(user, foo, bar):
        pass
The goal is to write a caching wrapper. The decorator needs the cache key, but the key will include variables passed into the function and will be different for each function it wraps.
Ideally, this lets the decorator check for a cached value for a given key and, if it's not found, execute the function to get the value and cache it. That way, if the value is in the cache, the code that creates the value (i.e. my_function) is never executed; if it's not found, my_function is executed, its result is stored in the cache, and it's also returned.
Another alternative would be something akin to blocks:

    def my_function(foo, bar):
        cache_value("my_key_{}".format(foo), {block of code to generate value that is only called if necessary})

In Objective-C or JS this would be a block, so I can keep the value generation both locally defined and changeable, but only executed if necessary. I'm too new to Python to fully grasp how to do this with its version of closures.
Thanks!
Update
While the solution below worked for decorators, I ended up going the block-like route because of the extra metadata required to attach to each cache entry to ensure it can be properly invalidated. Having this metadata defined with the value generation (as opposed to inside the caching function) is easier to maintain. This looks like:
    def my_function(foo, bar):
        def value_func():
            return code_to_generate_value_using_foo_bar
        return get_set_cache(key, value_func, ...)

    def get_set_cache(key, value_function, ...):
        value = cache.get(key)
        if value is None:
            value = value_function()
            cache.set(key, value)
        return value
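As a minimal runnable sketch of this pattern, here it is with a plain dict standing in for the cache backend (the `_cache` dict and the `foo + bar` body are stand-ins; a real backend and the extra invalidation metadata would replace them):

```python
# Stand-in cache; swap for memcached/redis/etc. in real code.
_cache = {}

def get_set_cache(key, value_func):
    """Return the cached value for key, computing and storing it on a miss."""
    value = _cache.get(key)
    if value is None:
        value = value_func()   # only runs on a cache miss
        _cache[key] = value
    return value

def my_function(foo, bar):
    def value_func():
        # expensive value generation using foo and bar goes here
        return foo + bar
    return get_set_cache("my_key_{}".format(foo), value_func)
```

Because `value_func` is a closure, it sees `foo` and `bar` from the enclosing call, but its body is only executed when the key is missing from the cache.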
Upvotes: 4
Views: 3319
Reputation: 91059
You could have your wrapper get a key building function:
    @cache_w_keyfunc(lambda foo, bar: (bar,))
    def my_function(foo, bar):
        pass

    @cache_w_keyfunc(lambda user, foo, bar: (foo, bar))
    def another_function(user, foo, bar):
        pass
The key builder should return things which are hashable, such as a tuple of strings. If they aren't hashable, such as lists, maybe transform them into strings.
This key building function gets the same arguments as the function itself and returns the key to be used.
    import functools

    def cache_w_keyfunc(keyfunc):
        def real_decorator(func):
            func.cache = {}
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                # Create the key now out of the wrapped function's name and the given keys:
                key = (func.__name__, keyfunc(*args, **kwargs))
                try:
                    return func.cache[key]
                except KeyError:
                    value = func(*args, **kwargs)
                    func.cache[key] = value
                    return value
            return wrapper
        return real_decorator
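To see it in action, here is the decorator as a self-contained sketch, with a call log added (the `calls` list and the `foo * 10` body are just for illustration) to show that the wrapped body runs once per distinct key:

```python
import functools

def cache_w_keyfunc(keyfunc):
    def real_decorator(func):
        func.cache = {}
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Key on the wrapped function's name plus whatever keyfunc picks out:
            key = (func.__name__, keyfunc(*args, **kwargs))
            try:
                return func.cache[key]
            except KeyError:
                value = func(*args, **kwargs)
                func.cache[key] = value
                return value
        return wrapper
    return real_decorator

calls = []

@cache_w_keyfunc(lambda foo, bar: (foo,))   # cache only on foo
def my_function(foo, bar):
    calls.append((foo, bar))
    return foo * 10

my_function(1, 2)   # miss: body runs
my_function(1, 3)   # hit: same key (1,), body skipped
my_function(2, 2)   # miss: new key (2,)
```

Since `bar` is not part of the key here, the second call returns the value cached by the first one even though `bar` differs.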
Upvotes: 3
Reputation: 83
You could pass two lists when creating the decorator: the first contains the positions of the positional arguments to key on, and the second contains the names of the keyword arguments.
    def cached(positions, names):
        def cached_decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                keys = ([func.__name__]
                        + [str(kwargs.get(name)) for name in sorted(names)]
                        + [str(args[position]) for position in positions])
                cache_key = '_'.join(keys)
                cached_value = cache.get(cache_key)
                if cached_value is not None:
                    return cached_value
                value = func(*args, **kwargs)
                cache.set(cache_key, value)
                return value
            return wrapper
        return cached_decorator
and you would use it like this:

    # this will cache the function using the b and name parameters
    @cached([1], ["name"])
    def heavy_calc(a, b, c, name=None):
        something_really_slow()
        return answer
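Here is the same idea as a runnable sketch, with a small dict-backed class standing in for the `cache` object (assumed to expose `get`/`set`; a real backend would replace it, and the `a + b + c` body is just a placeholder):

```python
import functools

class DictCache:
    """Minimal in-memory stand-in for a cache backend's get/set API."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key)
    def set(self, key, value):
        self._data[key] = value

cache = DictCache()

def cached(positions, names):
    def cached_decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Build the key from the function name, selected kwargs, and selected args:
            keys = ([func.__name__]
                    + [str(kwargs.get(name)) for name in sorted(names)]
                    + [str(args[position]) for position in positions])
            cache_key = '_'.join(keys)
            cached_value = cache.get(cache_key)
            if cached_value is not None:
                return cached_value
            value = func(*args, **kwargs)
            cache.set(cache_key, value)
            return value
        return wrapper
    return cached_decorator

@cached([1], ["name"])      # key on args[1] (b) and the name kwarg
def heavy_calc(a, b, c=0, name=None):
    return a + b + c
```

Note that `a` is not part of the key, so `heavy_calc(9, 2, name="x")` returns the value cached by `heavy_calc(1, 2, name="x")`.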
The problem is that you would also have to serialize the function's answer when storing it and deserialize it when retrieving it from the cache. Another problem is that two different calls can produce the same key: heavy_calc("hello_there", "foo") and heavy_calc("hello", "there_foo") join to the same string. The solution would be to serialize args and kwargs using json or msgpack, so you can be sure the keys are unique.
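A sketch of that serialization idea with json (`make_key` is a hypothetical helper, and it assumes the arguments are json-serializable):

```python
import json

def make_key(func_name, args, kwargs):
    # json preserves argument boundaries, so ("hello_there", "foo")
    # and ("hello", "there_foo") serialize to different keys.
    return json.dumps([func_name, list(args), sorted(kwargs.items())])

k1 = make_key("heavy_calc", ("hello_there", "foo"), {})
k2 = make_key("heavy_calc", ("hello", "there_foo"), {})
# k1 and k2 are distinct, unlike the '_'.join approach above
```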
If you are using Python 3.2 or later and you don't need to select which parameters to cache on, you can use functools.lru_cache.
Upvotes: 1
Reputation: 15680
Have you seen dogpile.cache?
It's a caching system that does exactly this.
You might be able to just use dogpile. If not, you can look at its source to see exactly how it works.
Incidentally, dogpile.cache handles all of the little details that you would otherwise have to worry about.
Upvotes: 1