Reputation: 3166
I have been getting the error FATAL ERROR: JS Allocation failed - process out of memory, and I have pinpointed the problem to sending a really large JSON object to res.json (or JSON.stringify).
To give you some context: I am basically sending around 30,000 config files (each config file has around 10,000 lines) as one JSON object.
My question is: is there a way to send such a huge JSON object, or is there a better way to stream it (e.g. using socket.io)?
I am using: node v0.10.33, [email protected]
UPDATE: Sample code
var app = express();
app.route('/events')
    .get(function(req, res, next) {
        var configdata = [{config:<10,000 lines of config>}, ... 10,000 configs];
        res.json(configdata); // The out of memory error comes here
    });
Upvotes: 4
Views: 9793
Reputation: 954
Try to use streams. What you need is a readable stream that produces data on demand. I'll write simplified code here:
var Readable = require('stream').Readable;
var rs = new Readable();
var sent = false;
rs._read = function () {
    if (!sent) {
        // assuming 10,000 lines of config fits in memory;
        // a plain Readable only accepts strings/Buffers, so stringify first
        rs.push(JSON.stringify({config: <10,000 lines of config>}));
        sent = true;
    } else {
        rs.push(null); // signal end-of-stream
    }
};
rs.pipe(res);
Upvotes: 2
Reputation: 3166
After a lot of trying, I finally decided to go with socket.io and send one config file at a time rather than all config files at once. This solved the out-of-memory problem that was crashing my server. Thanks for all your help.
Upvotes: 4
Reputation: 70075
You can try increasing the memory available to node with the --max_old_space_size
flag on the command line.
There may be a more elegant solution. My first reaction was to suggest using res.json()
with a Buffer object rather than trying to send the entire object in one shot, but then I realized that whatever converts the object to JSON will probably want the entire object in memory at once anyway. So you would run out of memory even after switching to a stream. Or at least that's what I would expect.
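For example (server.js and the 4 GB figure are just placeholders; the flag's value is in megabytes):

```shell
# Raise V8's old-generation heap limit to 4 GB before starting the server
node --max_old_space_size=4096 server.js
```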
Upvotes: 0