user1890597

Reputation: 53

Ruby - How to represent message length as 2 binary bytes

I'm using Ruby and I'm communicating with a network endpoint that requires the formatting of a 'header' prior to sending the message itself.

The first field in the header must be the message length which is defined as a 2 binary byte message length in network byte order.

For example, my message is 1024 in length. How do I represent 1024 as binary two-bytes?

Upvotes: 5

Views: 1833

Answers (2)

mu is too short

Reputation: 434585

The standard tools for byte wrangling in Ruby (and Perl and Python and ...) are pack and unpack. Ruby's pack lives on Array (Array#pack). You have a length that should be two bytes long and in network byte order; that sounds like a job for the n format specifier:

n | Integer | 16-bit unsigned, network (big-endian) byte order

So if the length is in length, you'd get your two bytes thusly:

two_bytes = [ length ].pack('n')

If you need to do the opposite, have a look at String#unpack:

length = two_bytes.unpack('n').first
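
Putting it together for the original use case, a minimal sketch (the names message and frame are only illustrative):

message = "hello endpoint"
header  = [ message.bytesize ].pack('n')   # two bytes, network byte order
frame   = header + message                 # what actually goes on the wire

Note that bytesize, not length, is the right count here if the message can contain multibyte characters.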

Upvotes: 5

Chris Heald

Reputation: 62638

See Array#pack.

[1024].pack("n")

This packs the number as the network-order byte sequence \x04\x00.
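A quick way to see those bytes in irb (just a sketch):

[1024].pack("n")          # => "\x04\x00"
[1024].pack("n").bytes    # => [4, 0]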

The way this works is that each byte is 8 bits. 1024 in binary is 10000000000; padded out to 16 bits and split into groups of 8 (8 bits per byte), that gives 00000100 00000000.

A byte can represent (2 states) ^ (8 positions) = 256 unique values. Since not all 256 values map to printable ASCII characters, bytes are conventionally written as hexadecimal pairs: a single hex digit covers 16 values, and 16 * 16 = 256. So the first byte, 00000100, splits into two 4-bit halves (nibbles), 0000 and 0100; translating binary to hex gives 0x04. The second byte is trivial: 00000000 is 0x00. That gives the hexadecimal representation of the two-byte string.
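
You can check that arithmetic directly in Ruby (a small sketch, nothing endpoint-specific):

1024.to_s(2)      # => "10000000000"       (11 bits)
"%016b" % 1024    # => "0000010000000000"  (padded to 16 bits)
"%02x" % 4        # => "04"                (first byte as hex)
"%04x" % 1024     # => "0400"              (both bytes as hex)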

It's worth noting that because you are constrained to a 2-byte (16-bit) header, you are limited to a maximum value of 11111111 11111111, i.e. 2^16 - 1 = 65535 bytes. Any message longer than that cannot have its length accurately represented in two bytes.
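
If there is any chance of exceeding that limit, it may be worth guarding before packing; a hedged sketch (message is an assumed variable name):

raise ArgumentError, "message too long for a 2-byte length header" if message.bytesize > 65535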

Upvotes: 5
