JackS

Reputation: 137

How do I avoid a "safety" over-quota panic when accessing the datastore? (billing is enabled)

I deployed my site to Google App Engine (using Go and the datastore, with 1000 records). Billing is enabled and a daily budget is established. The Quota Details page indicates everything is under quota. I am doing a urlfetch to obtain a TSV file that I use to build data entities in the datastore.

Two problems:

  1. Only 778 entities are created - the log indicates it is a long-running process, but it appears to terminate prematurely without an error message. The docs say this is normal.
  2. The second step involves creating a JSON file from the entities in the datastore. This process causes a "Panic: overquota", I suppose because the process is taking too long.

How do I proceed? Should I divide the TSV data file into several smaller files? Can I request "more time" so I don't go over the safety quotas?

It is also important to note that the datastore section of the Developers Console is showing some problems: although my application has access to 778 datastore entities, the console reports only 484 entities of that kind, with a total of only 704 entities of all kinds (there are actually 933).

I've been working at this for a while and am wondering whether there is something going on with the system, or whether there are things I can do to get my data entities set up properly. I also wish I could find more to read about safety quotas... and get the remote API working! Thanks!

Upvotes: 2

Views: 130

Answers (1)

Jesse

Reputation: 8393

It really depends on where, within the App Engine platform, you are doing the processing for both of these use cases.

For example, if you are performing the urlfetch and processing the file within a frontend instance, then you have 60 seconds to do all of this processing: App Engine requires that frontend instances respond to each request within 60 seconds.

I'm assuming that this is what you are doing, since your request is being terminated. To get around this time restriction, you should move this type of batch data processing to the task queue, where each task is required to complete within 10 minutes (see the sketch below).
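A minimal sketch of that hand-off, using the classic App Engine Go packages (`appengine`, `appengine/taskqueue`); the `/worker/import` route and handler names are hypothetical, so adjust them to your app:

```go
package app

import (
	"net/http"

	"appengine"
	"appengine/taskqueue"
)

// startImport is the user-facing handler. Rather than fetching and
// processing the TSV inline (subject to the 60-second frontend
// deadline), it enqueues a task that runs with the task queue's
// 10-minute deadline.
func startImport(w http.ResponseWriter, r *http.Request) {
	c := appengine.NewContext(r)

	// "/worker/import" is a hypothetical route; register a handler
	// there that performs the urlfetch and the datastore writes.
	t := taskqueue.NewPOSTTask("/worker/import", nil)
	if _, err := taskqueue.Add(c, t, ""); err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	w.WriteHeader(http.StatusAccepted)
}
```

Inside the worker, writing the parsed rows with `datastore.PutMulti` in chunks of a few hundred entities keeps each datastore call small, which should also help with the first problem (only 778 of the 1000 entities being created before the request dies).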

The same holds true for your reads. Either you need to look at how you're reading data from the datastore, or you need to batch it up with either a deferred task or a pipeline, as sketched below.
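One way to batch the reads is a cursor-based query loop, sketched here against the classic `appengine/datastore` API. The `Record` kind and its fields are hypothetical stand-ins for your entity; the output is newline-delimited JSON rather than a single array:

```go
package app

import (
	"encoding/json"
	"net/http"

	"appengine"
	"appengine/datastore"
)

// Record is a hypothetical stand-in for your entity kind.
type Record struct {
	Name  string
	Value string
}

const batchSize = 100

// exportJSON streams all Record entities as newline-delimited JSON,
// reading them in batches and resuming from a cursor between batches
// so no single datastore call runs long.
func exportJSON(w http.ResponseWriter, r *http.Request) {
	c := appengine.NewContext(r)
	enc := json.NewEncoder(w)

	q := datastore.NewQuery("Record").Limit(batchSize)
	for {
		it := q.Run(c)
		n := 0
		for {
			var rec Record
			if _, err := it.Next(&rec); err == datastore.Done {
				break
			} else if err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			if err := enc.Encode(&rec); err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			n++
		}
		if n < batchSize {
			return // short batch means we have read everything
		}
		// Remember where this batch ended and start the next one there.
		cursor, err := it.Cursor()
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		q = q.Start(cursor)
	}
}
```

If even the batched export won't fit in one request, the same loop can be moved into a task queue handler, with the cursor serialized into the next task's payload so each task resumes where the previous one stopped.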

Do you have a snippet you can share showing how you are composing your JSON?

Upvotes: 0
