Dolly Santer

Reputation: 33

"Exceeded soft private memory limit" runtime error

While exporting Google Spreadsheets from the NDB datastore, which is hardly 2 MB in size, the job eats up the full 128 MB of runtime memory on Google App Engine. How is this possible? I also made a bulk.yaml file, and I am using gapi calls and the deferred library to export the sheet on Google App Engine, and it shows an "Exceeded soft private memory limit" runtime error.

Upvotes: 2

Views: 87

Answers (1)

Ani

Reputation: 1457

I've had the same issue with the old Python gdata library when I exported data from Cloud Datastore (NDB lib) to Google Spreadsheets.

Normally, the issue didn't occur on the first export, but at some later point. Looking into the memory usage of the instances over time, I saw it increase with every export job.

The reason was a memory leak in my Python (2.7) code that handled the export. If I remember correctly, I had dicts and lists with plenty of references, some of them potentially in cycles, and the references hadn't been explicitly deleted after the job completed. At least with gdata, there was a lot of metadata held in memory for every cell or row the code referenced.

I don't think this is an issue specific to Google App Engine, Spreadsheets, or the libraries you are using, but rather how Python handles garbage collection: as long as references to an object remain reachable, its memory stays occupied.

Upvotes: 2
