m9_psy

Reputation: 3386

How should I store simple objects using Python and Redis?

Let's suppose I have a lot (hundreds) of big Python dictionaries; each pickled file is about 2 MB. I want to draw a chart using data from these dictionaries, so I have to load them all. What is the most efficient way (first in speed, second in memory) to store my data? Maybe I should use another caching tool? This is how I am solving the task now:

  1. Pickle each dictionary: pickle.dumps(d)
  2. Load the pickled string into Redis: redis.set(key, pickled)
  3. When a user needs a chart, I build an array and fill it with unpickled data from Redis, like this:

    array = []

    for key in keys:
        array.append(pickle.loads(redis.get(key)))


Now I have two problems. Memory, because my array is very big, is not important and easy to solve. The main problem is speed: many objects take more than 0.3 seconds to unpickle, and I even hit bottlenecks where unpickling takes more than 1 second. Getting the string from Redis is also rather expensive (more than 0.01 sec). When I have lots of objects, my users have to wait many seconds.

Upvotes: 0

Views: 620

Answers (1)

bradodarb

Reputation: 366

If it can be assumed that you are asking in the context of a web application and that you are displaying your charts in a browser, I would definitely recommend storing your dictionaries as JSON in redis.
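A minimal sketch of that trade-off, using a hypothetical chart dictionary (the Redis round trip is omitted so the comparison stands alone): with JSON, the stored value is text a browser can consume directly, while a pickle blob must always be deserialized on the server first.

```python
import json
import pickle

# Hypothetical chart data; real dictionaries will be much larger.
series = {"timestamps": list(range(1000)),
          "values": [i * 0.5 for i in range(1000)]}

# JSON: what you would store with redis.set(key, json_blob).
# The text can be sent to the browser as-is, skipping server-side decoding.
json_blob = json.dumps(series)
restored = json.loads(json_blob)

# Pickle: the server must call pickle.loads before it can respond.
pickle_blob = pickle.dumps(series)
restored_p = pickle.loads(pickle_blob)

assert restored == series and restored_p == series
```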

Again, you have not provided many details about your application, but I have implemented charting over very large data sets before (hundreds of thousands of sensor data points per second over several minutes). To help rendering performance, I stored each type of data in its own dictionary, or 'series'. This strategy lets you render only the portions of the data that are required.
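The per-series layout above could be sketched like this. The key naming scheme and helper functions are assumptions for illustration, and a plain dict stands in for the Redis client so the sketch is self-contained; with a real client, loading several series would be one round trip via redis.mget(keys).

```python
import json

# Stand-in for a Redis client: one key per series, JSON-encoded values.
store = {}

def save_series(name, points):
    # e.g. redis.set("chart:series:temperature", json.dumps(points))
    store["chart:series:" + name] = json.dumps(points)

def load_series(names):
    # With a real client: redis.mget(["chart:series:" + n for n in names])
    return {n: json.loads(store["chart:series:" + n]) for n in names}

save_series("temperature", [20.1, 20.4, 21.0])
save_series("humidity", [55, 54, 53])
save_series("pressure", [1013, 1012, 1011])

# Fetch and render only the series the user's chart actually needs:
visible = load_series(["temperature", "humidity"])
```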

Perhaps if you share more about your particular application we may be able to provide more help.

Upvotes: 1

Related Questions