Reputation: 3186
I'm receiving a lot of data on my socket.io client side that may or may not arrive complete, and it's also potentially big data, so I need to handle it efficiently. I have tried my hand at creating a buffer and parsing it into my app. I know there are streaming/buffering modules available, and I might consider using one of those if it achieves the goal of being more efficient. Looking forward to seeing your answers and possible arguments on how best to do this.
Note: bandwidth is not my concern so much as how quickly the client-side JavaScript can render the data into a browser-friendly format.
Here is what I've got so far.
// Like String.prototype.split, but with max-1 splits the remainder
// is returned whole as the last element instead of being discarded.
function extended_split(str, separator, max) {
    var out = [],
        index = 0,
        next;
    while (!max || out.length < max - 1) {
        next = str.indexOf(separator, index);
        if (next === -1) {
            break;
        }
        out.push(str.substring(index, next));
        index = next + separator.length;
    }
    out.push(str.substring(index));
    return out;
}
var buffer = '';
// Data format: "\nOP:ARGS:DATA\0" (DATA may be base64 or 'other', depending on OP)
socket.on('ioSend', function (data) {
    // Decode the incoming binary chunk into a string and append it to the buffer.
    data = String.fromCharCode.apply(null, new Uint16Array(data));
    buffer = buffer + data;
    // Keep extracting frames while a complete one (leading '\n', trailing '\0') exists.
    while (buffer.indexOf('\n') !== -1 && extended_split(buffer, '\n', 2)[1].indexOf('\0') !== -1) {
        var splitted = extended_split(extended_split(buffer, '\n', 2)[1], '\0', 2);
        var parse = splitted[0];
        buffer = splitted[1];
        parse = parse.split(':');
        // Do stuff with parse here
    }
});
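To see how this framing survives chunk boundaries, here is a standalone sketch (socket.io and the binary decode are stubbed out; `onChunk` stands in for the `ioSend` handler) that feeds one frame in two pieces:

```javascript
// Copy of extended_split so this snippet runs standalone.
function extended_split(str, separator, max) {
    var out = [], index = 0, next;
    while (!max || out.length < max - 1) {
        next = str.indexOf(separator, index);
        if (next === -1) break;
        out.push(str.substring(index, next));
        index = next + separator.length;
    }
    out.push(str.substring(index));
    return out;
}

var buffer = '';
var parsed = [];

// Stand-in for the 'ioSend' handler: data is already a string here.
function onChunk(data) {
    buffer += data;
    while (buffer.indexOf('\n') !== -1 &&
           extended_split(buffer, '\n', 2)[1].indexOf('\0') !== -1) {
        var splitted = extended_split(extended_split(buffer, '\n', 2)[1], '\0', 2);
        buffer = splitted[1];
        parsed.push(splitted[0].split(':'));
    }
}

// One frame delivered across two chunks, then a second complete frame:
onChunk('\nMSG:1:he');          // incomplete -> stays buffered
onChunk('llo\0\nEND:0:\0');     // completes frame 1, delivers frame 2
// parsed → [['MSG', '1', 'hello'], ['END', '0', '']]
```

The partial frame is simply held in `buffer` until its trailing `'\0'` arrives, which is the whole point of the approach.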
Upvotes: 0
Views: 1302
Reputation: 3186
I took another crack at this, dropped the extended-split idea, and came up with this:
socket.on('ioSend', function (data) { // Receive command from socket.io
    // buffer, d1 (front delimiter), and d2 (rear delimiter) are declared in an outer scope.
    if (safeBrowser) { // IE < 10 doesn't support Uint16Array
        var xdata = new Uint16Array(data);
        data = String.fromCharCode.apply(null, xdata);
        buffer = buffer + data; // Update the buffer with the most recent ioSend data
    } else { // So we have to kludge this in for IE < 10
        var xdata = '';
        for (var i = 0; i < data.length; i++) {
            xdata += String.fromCharCode(data[i]);
        }
        buffer = buffer + xdata; // Update the buffer with the most recent ioSend data
    }
    var splitA = [];
    var splitB = [];
    // Read the buffer until there are no complete commands left to issue.
    while (buffer.indexOf(d1) !== -1 && buffer.indexOf(d2) !== -1) {
        splitA = buffer.split(d2); // Split on the rear delimiter
        splitB = splitA[0].split(d1);
        doParse.call(null, splitB[1]); // This should be an @command
        splitB = null;
        splitA.shift(); // Drop the command just handled
        buffer = splitA.join(d2); // Rebuild the buffer, re-inserting the rear delimiter
    }
});
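A standalone run of that delimiter loop, with assumed delimiters `d1 = '\n'` (front) and `d2 = '\0'` (rear) and a stub `doParse` that just records what it receives:

```javascript
// Assumed delimiters matching the "\nOP...\0" frame format.
var d1 = '\n', d2 = '\0';
var buffer = '';
var commands = [];

// Stub: the real doParse would dispatch on the @command.
function doParse(cmd) { commands.push(cmd); }

// Stand-in for the 'ioSend' handler, with data already decoded to a string.
function onData(data) {
    buffer += data;
    while (buffer.indexOf(d1) !== -1 && buffer.indexOf(d2) !== -1) {
        var splitA = buffer.split(d2);       // Split on the rear delimiter
        var splitB = splitA[0].split(d1);    // Strip the front delimiter
        doParse(splitB[1]);
        splitA.shift();                      // Drop the command just handled
        buffer = splitA.join(d2);            // Rebuild the buffer
    }
}

// One command split across chunks, then completed along with nothing extra:
onData('\n@first\0\n@sec');
onData('ond\0');
// commands → ['@first', '@second']
```

Note that `split(d2)` followed by `join(d2)` preserves any trailing partial frame, so a command arriving in pieces is still handled correctly once its rear delimiter shows up.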
It's really fast in all my unit tests and does the job well. I'm working on an implementation that doesn't use socket.io, as @GeoPhoenix suggested, but until then this works fine.
Upvotes: 0
Reputation: 7165
Rolling your own buffer builder/parser is OK, but you can spend double the time writing and maintaining it compared to just grabbing a production-ready script.
From my point of view, I would first drop socket.io for your case, since it just doesn't transmit binary as it should; there are other modules, such as https://github.com/binaryjs/binaryjs, that are better suited for binary transmission over the WebSocket protocol.
I would also try http://bsonspec.org/ (check the implementations for Node modules), which encodes your JSON into binary; that way you could skip the whole problem of building and maintaining the buffer parser/builder.
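Real BSON comes from an npm package, but the core idea, binary-framed structured data with no delimiter scanning, can be sketched dependency-free in Node with a length-prefixed JSON frame. Everything here (`encode`, `decode`, the 4-byte big-endian length prefix) is an illustrative assumption, not the BSON wire format itself:

```javascript
// Sketch of length-prefixed binary framing: the receiver reads an exact
// byte count instead of scanning for '\n' / '\0' delimiters.
function encode(obj) {
    var body = Buffer.from(JSON.stringify(obj), 'utf8');
    var frame = Buffer.alloc(4 + body.length);
    frame.writeUInt32BE(body.length, 0); // 4-byte length prefix
    body.copy(frame, 4);
    return frame;
}

function decode(frame) {
    var len = frame.readUInt32BE(0);
    return JSON.parse(frame.slice(4, 4 + len).toString('utf8'));
}

var msg = { op: 'MSG', args: 1, data: 'hello' };
var roundTripped = decode(encode(msg));
// roundTripped → { op: 'MSG', args: 1, data: 'hello' }
```

Because the payload length is known up front, base64 data containing `:` or control characters needs no escaping, which is one of the practical wins of a binary format like BSON over delimiter-based framing.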
Upvotes: 1