stiller_leser

Reputation: 1572

JavaScript performance when handling large arrays

I'm currently writing an image editing program in JavaScript. I chose JS because I wanted to learn more about it. The average image I'm handling is about 3000 x 4000 pixels. Converted into imageData (for editing the pixels), that adds up to 48,000,000 values I have to deal with. That's why I decided to introduce web workers and let each of them edit only an n-th part of the array. Assuming I have ten web workers, each worker has to deal with 4,800,000 values. To be able to use web workers, I divide the big array by the number of threads I've chosen. The piece of code I use looks like this:

while(pixelArray.length > 0){
    cD.pixelsSliced.push(pixelArray.splice(0, chunks)); //Chop off a chunk from the picture array
}
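As a side note, repeatedly splicing a 48-million-element array is itself costly, since each splice reshuffles the remainder. A minimal sketch of the same split using `subarray` on a typed array, which creates views on the same buffer without copying (the helper name `splitIntoChunks` is hypothetical, not from the code above):

```javascript
// Sketch: split a typed pixel array into `parts` views without copying.
// `splitIntoChunks` is a hypothetical helper, not from the original code.
function splitIntoChunks(pixels, parts) {
  const chunkSize = Math.ceil(pixels.length / parts);
  const chunks = [];
  for (let i = 0; i < pixels.length; i += chunkSize) {
    // subarray returns a view on the same underlying buffer; no copy is made
    chunks.push(pixels.subarray(i, i + chunkSize));
  }
  return chunks;
}

const pixels = new Uint8ClampedArray(48000000); // same size as the question's data
const chunks = splitIntoChunks(pixels, 10);
```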

Later, after the workers have done something with the array, each saves its result into another array. Each worker has an ID and saves its part at the index of its ID in that array (to make sure the chunks stay in the correct order). I use $.map to concat that array (looking like [[1231][123213123][213123123]]) into one big array ([231231231413431]) from which I will later create the imageData I need. It looks like this:

cD.newPixels = jQuery.map(pixelsnew, function(n){
    return n;
});
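For what it's worth, jQuery.map pushes every one of the 48 million elements through a callback. Since the total length is known in advance, a preallocated typed array filled with bulk `set` calls avoids that per-element overhead. A sketch under those assumptions (`flattenChunks` is a hypothetical helper name):

```javascript
// Sketch: concatenate the id-ordered worker result chunks into one
// typed array. `flattenChunks` is a hypothetical helper, not from the
// original code.
function flattenChunks(resultChunks, totalLength) {
  const out = new Uint8ClampedArray(totalLength);
  let offset = 0;
  for (const chunk of resultChunks) {
    out.set(chunk, offset); // bulk copy, no per-element callback
    offset += chunk.length;
  }
  return out;
}
```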

After this array (cD.newPixels) is created, I create imageData and copy the image into the imageData object like so:

cD.imageData = cD.context.createImageData(cD.width, cD.height);
for(var i = 0; i < cD.imageData.data.length; i +=4){ //Build imageData
    cD.imageData.data[i + eD.offset["r"]] = cD.newPixels[i + eD.offset["r"]];
    cD.imageData.data[i + eD.offset["g"]] = cD.newPixels[i + eD.offset["g"]];
    cD.imageData.data[i + eD.offset["b"]] = cD.newPixels[i + eD.offset["b"]];
    cD.imageData.data[i + eD.offset["a"]] = cD.newPixels[i + eD.offset["a"]];
}
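Since the four offsets together cover every byte of each pixel, the per-channel loop is just a full copy; `ImageData.data` is a `Uint8ClampedArray`, so one bulk `set` can do the same work. The sketch below uses stand-in buffers because `createImageData` is browser-only; in the page this would be `cD.imageData.data.set(cD.newPixels)`, assuming `cD.newPixels` already holds the full r,g,b,a sequence at the right length:

```javascript
// Sketch: one bulk set() replaces four writes per pixel.
// Stand-in typed arrays are used here since createImageData is browser-only.
const width = 2, height = 2;
const imageDataBytes = new Uint8ClampedArray(width * height * 4); // stand-in for cD.imageData.data
const newPixels = Uint8ClampedArray.from({ length: width * height * 4 }, (_, i) => i);
imageDataBytes.set(newPixels); // copies every channel of every pixel at once
```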

Now I do realize that I'm dealing with a huge amount of data here and that I probably shouldn't use the browser for image editing but a different language (I'm using Java at uni). However, I was wondering if you have any tips regarding performance, because frankly I was pretty surprised when I tried a big image for the first time. I didn't expect that it would take "that" long to load the image (first piece of code). Firefox actually thinks that my script is broken. The other two pieces of code are the ones I found to slow down the script (which is normal). So yeah, I would be thankful for any tips.

Thank you

Upvotes: 3

Views: 2506

Answers (1)

lejahmie

Reputation: 18253

I would recommend looking into Transferable Objects instead of structured cloning when using Web Workers. Web Workers normally use structured cloning to pass objects; in other words, a copy is made. This can take a lot of time for large objects such as large images.

When using Transferable Objects, data is transferred from one context to another rather than copied. In other words, zero-copy, which should improve the performance of sending data to a worker.

For more info check: http://www.w3.org/html/wg/drafts/html/master/infrastructure.html#transferable-objects
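A minimal sketch of the transfer semantics. In the browser the call would be `worker.postMessage(pixelBuffer, [pixelBuffer])`; here `structuredClone`'s `transfer` option (available in Node 17+ and modern browsers) demonstrates the same effect, so the variable names are illustrative only:

```javascript
// Sketch: transferring an ArrayBuffer instead of cloning it.
// In the browser: worker.postMessage(pixelBuffer, [pixelBuffer]);
const pixels = new Uint8ClampedArray(16).fill(255);
const received = structuredClone(pixels, { transfer: [pixels.buffer] });
// The sender's buffer is now detached: ownership moved, no bytes were copied.
```

After the transfer, the sending side can no longer read the buffer, which is exactly why it is cheap: ownership moves instead of data.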

Also, another idea would be to move the task of splitting and putting back together the large array into a Web Worker as well. Just brainstorming here, but perhaps you could first spawn a Web Worker, let's call it the mother worker. This worker could split the array and then spawn 10 child workers that perform the heavy-duty task and send their parts back to their mother.

The mother finally puts it all back together and sends it back to the main application.

Upvotes: 2
