angryweasel

Reputation: 376

Best way to propagate cache-control policy in a FastAPI application?

Using FastAPI and FastAPICache, we can use the Cache-Control header to let an API response either be force-recomputed or served from a cache.

At the endpoint level, when the Cache-Control policy allows it, a response to a previously seen request can be served from the cache.

I want similar behaviour at the function level. For some function bar inside my endpoint foo (potentially nested, and in another file/module), which takes a single argument a, how can bar tell whether it is allowed to return from cache?

Obviously we could do something like bar(a=request.a, cache_control=request.headers.cache_control), but I'm thinking something like:

In main.py

@router.post("/foo")
async def foo(request: Request):
    body = await request.json()  # Request.body()/json() are async, so the endpoint must be too
    return bar(a=body["a"])

In utils.py

def bar(a: int):

    # something like (fastapi.current_context() is the hypothetical part):
    cc = fastapi.current_context().cache_control()

    if cc.no_cache() or a not in cache:
        # either forced to recompute, or no cached value yet
        val = some_expensive_computation(a)
    else:
        val = cache.get(a)

    if not cc.no_store():
        cache.set(a, val)

    return val

Upvotes: 2

Views: 435

Answers (0)
