Reputation: 6294
I have an API that returns a list of objects joined from across my entire database. (Yes, everything is needed; it's a cool report :))
Before I added one of the tables, the data was 150 MB and everything was OK. Now that I've added a big table (about 50 MB), the response crashes the browser.
Is there a way to get a 200MB response without crashing the browser?
Upvotes: 1
Views: 1227
Reputation: 163240
Assuming that you do need all this data in the browser, and that you're willing to accept the performance implications of working with a large data set...
The problem here likely isn't the raw size of the data itself, but the format you have it in.
If you dump 200MB into JSON and expect your browser to parse that into a single object in memory with all that data, you're going to have a bad time. Same is true with XML.

Additionally, I suspect that with a 200MB download, you want to show some progress to the user as you load the data... maybe even show some of that data as it's loaded.

The solution to all of this is to chunk your data. I'm guessing that your data is largely array-based. Load elements of that array chunk by chunk. Start with 10k records at a time and see where that gets you (a rough sketch follows).
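For illustration only, something like this is what I mean by chunked loading. The `/api/report` endpoint and its `offset`/`limit` parameters are assumptions; adapt them to whatever your API actually exposes:

```typescript
// Sketch of chunked loading, assuming a hypothetical paginated endpoint
// that accepts offset/limit parameters and returns a JSON array of records.
type ReportRow = Record<string, unknown>;

async function loadReportInChunks(
  onChunk: (rows: ReportRow[], loadedSoFar: number) => void,
  chunkSize = 10_000,
): Promise<ReportRow[]> {
  const all: ReportRow[] = [];
  let offset = 0;

  while (true) {
    // Hypothetical endpoint; replace with your real report API.
    const res = await fetch(`/api/report?offset=${offset}&limit=${chunkSize}`);
    if (!res.ok) throw new Error(`Request failed: ${res.status}`);

    const rows: ReportRow[] = await res.json();
    all.push(...rows);
    onChunk(rows, all.length); // update a progress bar or render partial results here

    if (rows.length < chunkSize) break; // last page reached
    offset += chunkSize;
  }
  return all;
}
```

Each response is a small, quickly parsed JSON document, so the browser never has to build one 200 MB object in a single step.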
You can still use AJAX to fetch the data, since you will have solved the problem of parsing the responses. However, you might also consider utilizing web sockets to reduce some of the request overhead. (Or, use HTTP/2 where possible.)
I should also point out that there are streaming parsers available for JSON and XML. This requires you to be able to get the data as a stream (web sockets make this easy), and whether it will be useful to you depends on your data format.
If after chunking your data you still have crashes, then it's time to get cozy with the developer tools and profile your memory usage. There are limits to what you can load in a web page, which vary from browser to browser (and system to system, especially on mobile devices).
2020 Update: Line-delimited JSON (ND-JSON) is the usual way to do this chunking these days. You can stream the result client-side and parse as you go. No streaming parser is necessary; you just need a transform step to handle the lines.
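As a minimal sketch of that transform step (the `/api/report.ndjson` URL and the callback are made up; assume the server sends one JSON object per line):

```typescript
// Stream an ND-JSON response and parse it line by line as it arrives,
// instead of buffering and parsing one giant JSON document.
async function streamNdjson(
  url: string,
  onRecord: (record: unknown) => void,
): Promise<void> {
  const res = await fetch(url);
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep the trailing partial line for the next chunk

    for (const line of lines) {
      if (line.trim()) onRecord(JSON.parse(line));
    }
  }
  if (buffer.trim()) onRecord(JSON.parse(buffer)); // final record, if any
}

// Usage: handle each record as it arrives, e.g.
// streamNdjson("/api/report.ndjson", (record) => renderRow(record));
```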
Upvotes: 5
Reputation: 2040
I don't know if this helps, but you can split the data into chunks and make multiple calls, each one starting after the previous has finished (maybe passing an index for chunk order). This way you can prevent future surprises if you really must get all 200 MB of data.
Normally I would fetch the most important data first and load the rest only when it's needed (usually on a user action); a rough sketch of that is below.
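Something like this, purely as an illustration; the `/api/report/summary` and `/api/report/details/...` endpoints are hypothetical:

```typescript
// Load the essential data up front, defer the heavy detail data until the
// user actually asks for it (e.g. expanding a row or clicking a button).
async function loadSummary(): Promise<unknown[]> {
  const res = await fetch("/api/report/summary");
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

// Called from a user action, not on initial page load.
async function loadDetails(rowId: string): Promise<unknown> {
  const res = await fetch(`/api/report/details/${encodeURIComponent(rowId)}`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```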
Upvotes: 0