user910210

Reputation: 313

How can I share a cache between Gunicorn workers?

I am working on a small service using Gunicorn and Flask (Python 3.6). The pseudocode below shows roughly the behavior I want. There are a lot of serialized foo objects, and I want to hold as many of them in memory as possible and evict them on an LRU basis.

from flask import Flask, request

app = Flask(__name__)
cache = Cache()  # placeholder for an LRU cache of deserialized foo objects


@app.route('/')
def foobar():
    name = request.args['name']
    foo = cache.get(name)
    if foo is None:
        foo = load_foo(name)   # deserialize the foo object by name
        cache.add(name, foo)   # key by name so later lookups can hit

    return foo.bar()
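To make the behavior concrete, here is a minimal sketch of what the `Cache` placeholder could look like, assuming it is keyed by name and bounded by a hypothetical `maxsize` entry count (this still lives in one process, which is exactly the limitation described below):

```python
from collections import OrderedDict


class Cache:
    """A small LRU cache: evicts the least recently used entry when full."""

    def __init__(self, maxsize=128):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def get(self, name):
        if name not in self._data:
            return None
        self._data.move_to_end(name)  # mark as recently used
        return self._data[name]

    def add(self, name, foo):
        self._data[name] = foo
        self._data.move_to_end(name)
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # drop least recently used
```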

The problem is that I do not know how to share this cache between Gunicorn workers. I'm working with limited memory and don't want to hold duplicate objects. Certain objects will be used very often and others probably never, so I think it really makes sense to hold them in memory.

This service will only take requests from another application (both running on the same server); I just wanted to keep this code separate. Am I going in completely the wrong direction by even using Gunicorn in the first place?

Upvotes: 14

Views: 6691

Answers (1)

Max Paymar

Reputation: 708

I don't see anything wrong with using Gunicorn, but it's probably not necessary to think about scaling horizontally unless you are close to putting this into production. Either way, I'd recommend using a separate service as the cache rather than holding objects in Python memory. That way, each worker can open its own connection to the cache as needed. Redis is a popular option, but you may have to do some data manipulation to store the objects, e.g. storing them as JSON strings rather than Python objects. Redis can act as an LRU cache via configuration: https://redis.io/topics/lru-cache
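As a sketch of that approach: the helpers below (`cache_get`/`cache_set` are illustrative names, not a Redis API) serialize objects to JSON around any client that exposes `get`/`set`, such as a `redis.Redis()` connection opened per worker. LRU eviction itself would be handled by Redis, assuming `maxmemory` and `maxmemory-policy allkeys-lru` are set in its configuration.

```python
import json


def cache_get(client, name):
    """Return the cached object for `name` as a dict, or None on a miss."""
    raw = client.get(name)
    return json.loads(raw) if raw is not None else None


def cache_set(client, name, obj):
    """Store a JSON-serializable object under `name`."""
    client.set(name, json.dumps(obj))


# In the Flask view this might look like (assuming the redis package):
#
#   import redis
#   r = redis.Redis()          # one connection per Gunicorn worker
#   foo = cache_get(r, name)
#   if foo is None:
#       foo = load_foo(name)
#       cache_set(r, name, foo)
```

Since every worker talks to the same Redis instance, there is one copy of each object regardless of how many workers Gunicorn forks.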

Upvotes: 1
