Phil

Reputation: 4870

Fetch large JSON from Datastore

I've created an API on Google Cloud Endpoints that gets all the data from a single entity kind in Datastore. The NoSQL request (a really simple one: `Select * from Entity`) is performed with Objectify.
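Expressed with Objectify, the query above would look roughly like this (a sketch, assuming an entity class `Entity` registered with Objectify; not the asker's actual code):

```java
import com.googlecode.objectify.ObjectifyService;
import java.util.List;

// Load every Entity of this kind in a single query. With ~200 parent
// entities plus their children, this is the call that produces the
// large JSON response in one go.
List<Entity> all = ObjectifyService.ofy().load().type(Entity.class).list();
```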

This Datastore kind is populated with 200 rows (entities), and each row (entity) has a list of child entities of the same kind.

So when I call the API, a JSON response is returned. Its size is about 641 KB and it has 17K lines.

When I look at the API Explorer, it tells me that the request takes 4 seconds to execute.

I would like to decrease that time, because it's really high... I've already:

It helps a little, but I don't think this is the most efficient way...

Should I use BigQuery to generate the JSON file faster? Or maybe there is another solution?

Upvotes: 2

Views: 378

Answers (1)

Ramesh Lingappa

Reputation: 2488

Do you need all the entities in a single request?

  • If not, then you can fetch entities in batches using cursor queries and display them as needed, e.g. fetch 20 or 30 entities at a time depending on your needs.
  • If yes,
    • Does your meal entity change often?
      • If no, you can generate a JSON file and store it in GCS, and update that file whenever your entity changes. Fetching on the client end will then be a lot faster, and with an ETag header new content can be pulled easily.
      • If yes, then I think batch fetching is the only effective way to pull that many entities.
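The cursor-based batching in the first bullet can be sketched with Objectify's query cursors (assuming an entity class `Meal` and a hypothetical `PageResult` wrapper; the `limit`/`startAt`/`getCursor` calls are from the Objectify query API):

```java
import com.google.appengine.api.datastore.Cursor;
import com.google.appengine.api.datastore.QueryResultIterator;
import com.googlecode.objectify.cmd.Query;
import java.util.ArrayList;
import java.util.List;

import static com.googlecode.objectify.ObjectifyService.ofy;

// Fetch one page of Meal entities. The client sends back the
// web-safe cursor string from the previous response to get the
// next page; null means "start from the beginning".
public PageResult listMeals(String webSafeCursor) {
    Query<Meal> query = ofy().load().type(Meal.class).limit(20);
    if (webSafeCursor != null) {
        query = query.startAt(Cursor.fromWebSafeString(webSafeCursor));
    }

    QueryResultIterator<Meal> it = query.iterator();
    List<Meal> page = new ArrayList<>();
    while (it.hasNext()) {
        page.add(it.next());
    }

    // Hand the cursor back only when a full page was returned,
    // i.e. when there may be more entities to fetch.
    String nextCursor =
            page.size() == 20 ? it.getCursor().toWebSafeString() : null;
    return new PageResult(page, nextCursor);
}
```

Each response then stays small (20 entities instead of 17K lines of JSON), and the client pages through results only as far as it actually needs.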

Upvotes: 5
