Robdll

Reputation: 6253

How would you store a large amount of JSON?

Recently my team has been asked to find a solution to provide our customers with a snapshot of their dashboard at a given time. These dashboards contain several pages (15, more or less), and each page contains a different number of interactive charts populated by JSON fetched from an external API. Each page requires roughly 10 requests to that API, and each request returns a JSON document.

To accomplish the task we would need to store all the responses for all the pages. Considering an average size of 500 bytes each, that is 10 JSON documents * 15 pages = 150 documents * 500 bytes = 75 KB (if I didn't make a mistake with the math) per customer. Consider a user base of 200-300 customers (in the best case).
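A quick sanity check of those numbers (plain arithmetic using the estimates above, nothing project-specific):

```typescript
// Back-of-the-envelope check of the storage estimate above.
const bytesPerResponse = 500;   // average JSON response size
const requestsPerPage = 10;
const pagesPerDashboard = 15;
const customers = 300;          // upper bound of the user base

const bytesPerCustomer = bytesPerResponse * requestsPerPage * pagesPerDashboard;
console.log(bytesPerCustomer / 1024);        // ~73 KiB (~75 kB) per customer snapshot

const totalBytes = bytesPerCustomer * customers;
console.log(totalBytes / (1024 * 1024));     // ~21 MiB for the whole user base
```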

How would you suggest storing this JSON?

We are considering the following options:

- DynamoDB
- AWS S3
- MySQL 5.7 (with the JSON type)
- MySQL older than 5.7 (using a TEXT or BLOB type)
- MongoDB

But for each option we have some doubts. For example:

- We could store all the pages in a single file on S3, but wouldn't it be inefficient to send the user all the data when he only wants to see a single page? (Usually our users view 3-4 pages per session.) A per-page layout, as in the sketch after this list, would avoid that.
- What if we use a NoSQL store? Are we sure it would be scalable in the long term?
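To make the S3 concern concrete, here is a rough sketch of storing one object per customer and page, so a session only downloads the 3-4 pages actually viewed. The region, bucket name, and key layout are placeholders, not our real setup:

```typescript
import {
  S3Client,
  PutObjectCommand,
  GetObjectCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "eu-west-1" }); // placeholder region
const BUCKET = "dashboard-snapshots";             // placeholder bucket name

// Store the ~10 API responses of a single page as one JSON object.
async function savePageSnapshot(
  customerId: string,
  page: number,
  responses: unknown[],
): Promise<void> {
  await s3.send(
    new PutObjectCommand({
      Bucket: BUCKET,
      Key: `${customerId}/page-${page}.json`,
      Body: JSON.stringify(responses),
      ContentType: "application/json",
    }),
  );
}

// Fetch only the page the user is currently viewing.
async function loadPageSnapshot(
  customerId: string,
  page: number,
): Promise<unknown[]> {
  const result = await s3.send(
    new GetObjectCommand({
      Bucket: BUCKET,
      Key: `${customerId}/page-${page}.json`,
    }),
  );
  const body = await result.Body?.transformToString();
  return body ? JSON.parse(body) : [];
}
```

With this layout a snapshot is ~5 KB per page, so serving a single page stays cheap even if a customer never opens the other 11-12 pages.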

So, in the end, we have a lot of options and no option at the same time! I know it isn't easy to identify the best solution from the info above, but I think it's great to hear some advice from the community too. Clearly we are open to any suggestion. Thanks in advance to anyone who wants to help.

Upvotes: 0

Views: 72

Answers (1)

gilgil28

Reputation: 540

It is probably best to store the pages separately and load them on the fly, only when you need them. For a better user experience, always load the current page, the previous page, and the next page.
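A minimal sketch of that idea on the client, assuming the snapshots are exposed per page behind an endpoint such as `/snapshots/{customerId}/{page}` (a hypothetical URL, adjust it to whatever storage you pick):

```typescript
// Cache of already-fetched pages so navigating back is instant.
const pageCache = new Map<number, unknown>();

async function fetchPage(customerId: string, page: number): Promise<unknown> {
  if (pageCache.has(page)) return pageCache.get(page);
  const res = await fetch(`/snapshots/${customerId}/${page}`);
  const data = await res.json();
  pageCache.set(page, data);
  return data;
}

// Placeholder for the existing chart-rendering code.
function render(data: unknown): void {
  console.log("rendering page snapshot", data);
}

// Show the requested page, then warm the cache with its neighbours.
async function showPage(customerId: string, page: number, lastPage: number) {
  const current = await fetchPage(customerId, page);
  render(current);

  // Prefetch previous and next page in the background.
  if (page > 1) void fetchPage(customerId, page - 1);
  if (page < lastPage) void fetchPage(customerId, page + 1);
}
```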

Upvotes: 1
