Reputation: 135
Recently I've been learning some Windows socket programming to set up socket connections.
Inside the code we call functions like htonl() and htons() to convert data from host byte order to network byte order (big-endian), since some machines, such as those with Intel CPUs as far as I know, store data in little-endian order.
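For context, this is roughly how the calls appear in my code (a minimal sketch; the port number and loopback address are just placeholder values):

```c
#include <winsock2.h>  /* htons(), htonl(); link with ws2_32.lib */
#include <stdio.h>

int main(void) {
    /* Minimal sketch of where the conversions show up when filling
       a sockaddr_in; 8080 and loopback are placeholder values. */
    struct sockaddr_in addr = {0};
    addr.sin_family      = AF_INET;
    addr.sin_port        = htons(8080);            /* host -> network order */
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK); /* 127.0.0.1 */

    printf("port in network order: 0x%04x\n", addr.sin_port);
    return 0;
}
```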
But what confuses me is this: isn't the really important thing the bit order rather than the byte order?
After all, the bit is the minimum unit a computer works with, not the byte. Let's say we want to pass a u_short u = 0x0102 from a little-endian machine to another machine.
On our machine, u's first byte, the least significant one, is 02, and its second byte, the most significant one, is 01; let me express this as [02][01].
So we need to call htons(u) first to convert [02][01] into [01][02] and then send it over the network.
The remote machine will receive the byte 01 first and then the byte 02. But take that 02 byte: what actually arrives is a sequence of bits, 0000-0010. How does the machine know this byte is 2? Won't it think 0000-0010 represents 64 (0100-0000)? Do all machines order the bits within a byte in the same way?
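Here's a minimal sketch of that example as I understand it; on a little-endian machine I'd expect the output shown in the comments:

```c
#include <winsock2.h>  /* htons(); link with ws2_32.lib */
#include <stdio.h>

int main(void) {
    unsigned short u = 0x0102;   /* the example value */
    unsigned short n = htons(u); /* network (big-endian) order */

    unsigned char *pu = (unsigned char *)&u;
    unsigned char *pn = (unsigned char *)&n;

    /* On a little-endian machine this prints:
       host:    [02][01]
       network: [01][02] */
    printf("host:    [%02x][%02x]\n", pu[0], pu[1]);
    printf("network: [%02x][%02x]\n", pn[0], pn[1]);
    return 0;
}
```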
Upvotes: 3
Views: 238
Reputation: 20396
"Bit Ordering" is specified by the protocols used to transfer data from the hardware (like your network card/port) to host memory.
Technically, this applies in many, many cases. For example, you might wonder the same thing about a hard drive: if a drive written on a machine that stores bits in one order (01234567) is moved to a machine that stores bits in the opposite order (76543210), will it read the data correctly or not?
But the simple answer is that it always reads correctly, because the protocols that map the hard drive onto the system bus specify the exact ordering of the bits as they are "presented" to host memory.
Network cards and networking hardware behave similarly: they use a standardized "bit ordering" in hardware, and as part of the hardware protocol, they "present" those bits to the host in whatever form the host expects.
"Byte Ordering", of course, is a separate thing, and more difficult to deal with, because network hardware and storage hardware only recognize "streams of bytes" (yes, I'm oversimplifying) and don't much care what those bytes actually mean.
But in terms of "Bit Ordering", unless you're writing code "On the Metal", so to speak, you don't need to think about it.
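To make that concrete, here is a minimal sketch of the byte-order half, which is the part your code does have to handle. The two bytes in the buffer below are just an assumed example of data arriving in network order:

```c
#include <winsock2.h>  /* ntohs(); link with ws2_32.lib */
#include <stdio.h>
#include <string.h>

int main(void) {
    /* Pretend these two bytes just arrived off the wire: the
       network-order (big-endian) encoding of the value 0x0102. */
    unsigned char buf[2] = {0x01, 0x02};

    /* One way: copy the raw bytes and let ntohs() undo network order. */
    unsigned short net;
    memcpy(&net, buf, sizeof net);
    unsigned short host1 = ntohs(net);

    /* Another way: explicit shifts, which give the same result on any
       host regardless of its endianness. */
    unsigned short host2 = (unsigned short)((buf[0] << 8) | buf[1]);

    printf("%hu %hu\n", host1, host2); /* both print 258 (0x0102) */
    return 0;
}
```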
Upvotes: 2