Kiki Mirizio

Reputation: 33

Python proxy to store data in memory

I have started a small project using Python 3 with flask_restplus, and now that it needs to go to production I have to rework the code. I'd like some suggestions to make it more robust. Here is the description of my question:

Description

This proxy periodically requests some data from another service and exposes it through a REST API so a UI can consume it. Since the UI needs to display data in near real time (dropdowns, dynamic field updates, etc.), some of the data the proxy holds is cached in memory to give the user a real-time experience instead of waiting 3 seconds for a dropdown to populate XD.

To build the cache, I'm basically storing some of the data in a global dictionary, since all the responses that come from the other service are JSON, and this lets me quickly retrieve and use the data when needed.

    import threading
    import time

    dataCache = {}  # this is where I store the cache items

    def refresh_cache():  # refresh the cache periodically
        start = time.time()

        # Get all service desks
        get_all_project()

        # Get all organizations
        get_all_organizations_by_project()

        # Get all customers by desks
        get_all_customers_by_project()

        print("Refresh Cache Took " + str(time.time() - start))

        # Schedule the next refresh
        threading.Timer(app.config['CACHE_REFRESH'], refresh_cache).start()

I know this is a terrible idea, but for the sake of having a draft version and validating requirements it worked perfectly. Now, however, I need to grow.

Final question

What would be valid options to replace this awful global variable?

My thoughts

I'm thinking of mounting an in-memory sqlite3 database for the sake of speed and storing all this data there, since most of the time I end up doing "SQL join"-like queries with for loops and if statements while handling this data before sending it to the UI. E.g.:

    for rType in dataCache[companyName]['types']:
        if dataCache[companyName]['projectId'] == id:
            # ... build the response for the UI from rType here ...

PS: I don't need overkill/enterprise solutions, since the number of users won't exceed 1k at peak.

Upvotes: 0

Views: 345

Answers (1)

Adrian Krupa

Reputation: 1937

TL;DR: Look at memcached or Redis. They are both easy to deploy and configure, and both have Python clients.

Bear in mind that you probably run your app through some WSGI server that spawns multiple processes. In your solution every process would have its own copy of the cache. That's not desirable, because you would need to provide some cache-refresh synchronization if you need consistency between requests.

I would create a simple key-value store for the cache. That way your Python app stays stateless and is ready to answer requests right after it starts. This is very useful if you use the gevent worker class in gunicorn.
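A rough sketch of the Redis variant (assuming the redis-py client, a local Redis instance, and that your fetch functions return the JSON payloads instead of writing into dataCache; key names are illustrative):

    import json

    import redis

    # Shared cache that every WSGI worker process talks to;
    # host/port and key names are only illustrative
    cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

    def refresh_cache():
        # One place (a separate worker or scheduled job) refreshes the cache;
        # all web workers read the same data, so there are no per-process copies
        projects = get_all_project()
        cache.set("projects", json.dumps(projects), ex=300)  # expire after 5 minutes

    def get_projects():
        raw = cache.get("projects")
        return json.loads(raw) if raw else []

The same pattern works with memcached and a client such as pymemcache.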

Upvotes: 1
