Jens Ehrlich

Reputation: 873

Large JSON cannot be handled by the browser

We have an Apache server that provides a website. The website issues a GET request that runs a C++ program on the server. The program creates a 3D scene and answers the GET request with a JSON document containing the scene. The scene is then rendered in the browser using WebGL.

This works perfectly fine for small scenes. Chrome throws an error when the JSON is larger than ~125 MB; Firefox can handle JSON up to ~260 MB.

I create the GET request using jQuery:

BP2011D1.ServerProxy.prototype.loadMesh = function(requestParameter, callbackOnSuccess, callbackOnError)
{
    $.ajax({
        type: "GET",
        url: this.getServerURL() + "/cgi-bin/" + this._treemapDirectory + "/hpi_bp2011_app_fcgi",
        data: requestParameter + "&functionName=getMesh",
        dataType: "json",                   // jQuery parses the whole response text as JSON
        success: callbackOnSuccess.execute,
        error: callbackOnError.execute      // also fires when the response fails to parse
    });
};

For large JSON responses, callbackOnError is executed, so the JSON appears to be invalid.

I know that the JSON itself should be perfectly valid.

I think the browser cannot handle such a big JSON string. It clips some characters at the end, and the missing closing brackets make the JSON invalid.

Is there a way to handle this problem? I need to handle JSON responses up to 800 MB.

Upvotes: 2

Views: 6705

Answers (3)

user128511

Reputation:

You could try using a more compact format:

http://code.google.com/p/webgl-loader/

You could also roll your own format and download the large parts directly in binary using binary XHR (XHR2):

http://www.html5rocks.com/en/tutorials/file/xhr2/
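
A minimal sketch of the binary XHR approach, assuming a hypothetical getMeshBinary endpoint that streams raw 32-bit floats (vertex positions); the endpoint name and the data layout are assumptions, not part of your current setup:

// Sketch: fetch raw binary data with XHR2 instead of a JSON string.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/cgi-bin/app/getMeshBinary?sceneId=42", true);
xhr.responseType = "arraybuffer";   // ask for raw bytes, no string parsing

xhr.onload = function() {
    if (xhr.status === 200) {
        // Interpret the buffer as 32-bit floats (assumed vertex layout).
        var vertices = new Float32Array(xhr.response);
        // Hand the typed array straight to WebGL, e.g.:
        // gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
    }
};
xhr.send();

This avoids ever materializing the scene as one giant JavaScript string, which is where the browser limits bite.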

Upvotes: 2

Joseph

Reputation: 119847

Download the data in chunks (batches of several MB) via AJAX. Since WebGL already requires an advanced browser, you can use Web Workers as background "threads" to parse each batch and then add the result to a main object for use, as sketched below.

For every chunk:

download -> create worker -> parse JSON in worker -> add to main object
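
A minimal sketch of that pipeline, assuming the server can serve numbered chunks through a hypothetical part GET parameter, that the number of parts is known up front, and that each chunk is a self-contained JSON fragment; none of these are part of the original setup:

// parseWorker.js -- runs in the background, keeps JSON.parse off the UI thread
self.onmessage = function(e) {
    // e.data is one chunk of raw JSON text; parse it and send the object back
    self.postMessage(JSON.parse(e.data));
};

// main.js -- download chunks one by one and merge the parsed results
var scene = { meshes: [] };   // main object the chunks are merged into
var totalParts = 10;          // assumed to be known, e.g. from a first request

function loadPart(part) {
    if (part >= totalParts) { return; }        // all chunks loaded
    $.ajax({
        type: "GET",
        url: "/cgi-bin/app/hpi_bp2011_app_fcgi",
        data: "functionName=getMesh&part=" + part,
        dataType: "text",                      // keep it a string for the worker
        success: function(chunkText) {
            var worker = new Worker("parseWorker.js");
            worker.onmessage = function(e) {
                scene.meshes.push(e.data);     // add to the main object
                worker.terminate();
                loadPart(part + 1);            // fetch the next chunk
            };
            worker.postMessage(chunkText);
        }
    });
}
loadPart(0);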

Even if this circumvents the 800 MB JSON parsing issue, I don't know the effects of an 800 MB object. It would still be heavy on the browser heap.

Upvotes: 2

Herbert

Reputation: 5645

You need to identify what creates the problem:

  1. fetching JSON that big
  2. evaluating JSON that big (eval())
  3. storing an object that big in JavaScript

I assume you fetch the JSON by XHR. In the first two cases you could create some pagination, where you add a GET parameter part={0,1,2,3,4,5...}, allowing the browser to fetch a huge JSON in multiple XHR requests (implemented server-side). This requires the server to split the JSON and the browser to merge the parts:

{a:1, b:5} - split -> {a:1} and {b:5} - merge -> {a:1, b:5}

or:

[1, 5] - split -> [1] and [5] - merge -> [1, 5]

Please understand that while doing this, you need to find a good place to split and merge; splitting at the top level does not help if a single value dominates the payload, as in this case:

{small: <1 MB object>, huge: <799 MB object>}

Or you might decide to just fetch the raw string in parts and split and merge it yourself. A sketch of the merge step follows.
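
A minimal sketch of the client-side merge for the object case, assuming the server splits the top-level object across parts as described above; the mergeParts helper name is illustrative, not an existing API:

// Merge an array of part-objects into one object,
// e.g. mergeParts([{a: 1}, {b: 5}]) returns {a: 1, b: 5}.
function mergeParts(parts) {
    var merged = {};
    for (var i = 0; i < parts.length; i++) {
        for (var key in parts[i]) {
            if (parts[i].hasOwnProperty(key)) {
                merged[key] = parts[i][key];
            }
        }
    }
    return merged;
}

For the array case the merge is simply Array.prototype.concat over the parts.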

Upvotes: 2
