I've been struggling for a while to get my messages framed correctly between my Node.js server and my Erlang gen_tcp server. I was using {packet, line} successfully until I had to send large messages and needed to switch to length-prefixed framing.
I set gen_tcp to {packet, 2}, and on the Node.js TCP decode side I'm using the library from https://github.com/davedoesdev/frame-stream. It is also set to a packet size option of 2, and I have tried a packet size option of 4 as well.
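For reference, here is roughly how the decode side is wired up. This is a minimal sketch, assuming frame-stream's decode() accepts a lengthSize option (2 here, to match Erlang's {packet, 2}); the port is arbitrary:

    // Minimal sketch of the Node.js receive side. Assumes frame-stream's
    // decode() takes a lengthSize option matching Erlang's {packet, 2}.
    const net = require('net');
    const frame = require('frame-stream');

    const server = net.createServer(socket => {
      const decode = frame.decode({ lengthSize: 2 });
      socket.pipe(decode);
      decode.on('data', buf => {
        console.log('got frame of', buf.length, 'bytes');
      });
    });

    server.listen(8000); // port chosen arbitrarily for this sketch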
I found that this setup works well for any message up to 127 bytes long, but any message longer than that has a problem.
I ran a test, sending longer and longer messages from gen_tcp and reading out the first four bytes received on the Node.js side (a sketch of the tap follows the readout):
on message 127: HEADER: 0 0 0 127, frame length 127
on message 128: HEADER: 0 0 0 239, frame length 239 <----- both should be 128
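A sketch of how such a tap can look (a reconstruction, not the exact test code; the Buffer.from call reproduces what happens if the chunk arrives as a string rather than a Buffer):

    // Sketch: dump the first four bytes of each incoming chunk
    // (the {packet, 4} length header) on the Node.js side.
    socket.on('data', chunk => {
      const buf = Buffer.from(chunk); // copies a Buffer; re-encodes a string as UTF-8
      console.log('HEADER:', buf[0], buf[1], buf[2], buf[3],
                  'Frame length', buf.readUInt32BE(0));
    });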
Theories:
Data from wireshark shows the following:
The header bytes are encoded properly by gen_tcp, even beyond 127, since the hex values proceed as follows:
[00][7e][...] (126 length)
[00][7f][...] (127 length)
[00][80][...] (128 length)
[00][81][...] (129 length)
So the error must lie where the library on the Node.js side calls Node's readUInt16BE(0) or readUInt32BE(0) functions. But I checked the endianness, and both are big-endian.
If the header bytes are [A, B], then, in binary, this error first occurs after [00000000 01111111].
In other words, readUInt16BE(0) reads [00000000 10000000] as 0xEF (239)? That is not even an endianness option...?
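A quick sanity check shows readUInt16BE itself handles those bytes correctly when given a raw Buffer, which makes the reading above even stranger:

    // Sanity check: on a raw Buffer, readUInt16BE decodes 0x0080 as 128.
    const buf = Buffer.from([0x00, 0x80]);
    console.log(buf.readUInt16BE(0)); // prints 128, as expected for big-endian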
Thank you for any help with solving this.
Kind Regards
Dale
Answer:
I figured it out: the problem was caused by setting the socket to receive with UTF-8 encoding, which passes only ASCII (bytes 0 through 127) through unchanged. A header byte of 0x80 or above is not valid standalone UTF-8, so the decoder replaces it and corrupts the length field.
Don't do this: socket.setEncoding('utf8').
It seems obvious now, but that one line of code is hard to spot.
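This also explains the exact value 239: 0x80 is invalid standalone UTF-8, so Node's decoder substitutes the replacement character U+FFFD, whose UTF-8 encoding is EF BF BD, and 0xEF is 239. A short demonstration:

    // What setEncoding('utf8') does to a length header containing 0x80:
    // the invalid byte is replaced with U+FFFD, which re-encodes as EF BF BD.
    const header = Buffer.from([0x00, 0x80]);      // what gen_tcp sent
    const asString = header.toString('utf8');      // what the socket emitted
    const mangled = Buffer.from(asString, 'utf8'); // what the decoder then saw
    console.log(mangled);                 // <Buffer 00 ef bf bd>
    console.log(mangled.readUInt16BE(0)); // 239 (the bogus frame length)

Leaving the socket in binary mode (never calling setEncoding) hands frame-stream raw Buffers, and the framing then works for any byte value.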