Reputation: 483
I'm reading up on network technology, but there's something that's got me scratching my head. I've read that a popular encoding for sending data across Gigabit Ethernet is 8B/10B.
I've read how the data is packaged up in "frames", which in turn package up "packets" of the data the application needs. Here's where it gets fuzzy. When I write a page of HTML, I set the encoding to Unicode, and I understand that that page is packaged in the packet (formatted using the HTTP protocol, etc.)
If the HTML is in Unicode, but the Ethernet encoding is 8B/10B, how do the two encodings coexist? Is the message part of the packet in Unicode while the rest of the frame is 8B/10B?
Thanks for any help!
Upvotes: 1
Views: 702
Reputation: 5635
They really don't have much to do with each other. Ethernet is a "lower-level" protocol than HTTP, over which your HTML is sent.
The HTML itself is simply data, and Unicode is a way of encoding characters as bits/bytes.
In contrast, Ethernet is a communications protocol for transferring bits/bytes/packets across a link between devices.
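To make that concrete, here's a minimal Python sketch (the string and byte values are purely illustrative): Unicode encoding turns characters into bytes, and those bytes are all the lower layers ever see.

    # Unicode encoding just turns characters into bytes.
    # TCP, IP, and Ethernet never interpret these bytes as text.
    html = "<p>Caf\u00e9</p>"        # HTML source with a non-ASCII character
    payload = html.encode("utf-8")   # Unicode code points -> raw bytes

    print(payload)        # b'<p>Caf\xc3\xa9</p>'
    print(payload.hex())  # the byte values the network stack actually carries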
See here: http://en.wikipedia.org/wiki/OSI_model
In the OSI 7-layer model, Ethernet is basically layer 2, the data link layer. HTTP and your HTML character encoding live in the "data" layers above layer 4 (which is basically TCP). The abstraction at each layer means that each layer only has to worry about its own job. Layers 4 and below are responsible for getting your data from point A to point B; Ethernet is part of that "getting data from point A to point B" problem. The layers above are for figuring out what to do with that data, and your Unicode encoding is a "what to do with that data" question. As for 8B/10B, it sits even lower, at layer 1 (the physical layer): it's a line coding the hardware applies to a frame's bits as they go onto the wire and strips off on the way back in, so it never interacts with your character encoding at all.
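Here's a deliberately oversimplified Python sketch of that layering, just to show the shape of it; the bracketed strings are placeholders for the real TCP/IP/Ethernet headers, which of course have many fields:

    # Each layer wraps the payload it gets from the layer above.
    html_bytes = "<html>...</html>".encode("utf-8")         # application data (Unicode -> bytes)

    http_message = b"HTTP/1.1 200 OK\r\n\r\n" + html_bytes  # HTTP wraps the HTML
    tcp_segment  = b"[TCP header]" + http_message           # layer 4: TCP wraps HTTP
    ip_packet    = b"[IP header]"  + tcp_segment            # layer 3: IP wraps TCP
    eth_frame    = b"[Eth header]" + ip_packet              # layer 2: Ethernet wraps IP

    # Layer 1 (the PHY chip) then applies 8B/10B coding to eth_frame's
    # bits in hardware; software never sees it, and the Unicode bytes
    # ride through untouched.
    print(len(eth_frame), "bytes handed to the NIC")

The point is that each layer treats everything it receives from above as an opaque payload, which is exactly why the two "encodings" never have to coexist in the sense you were worried about.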
Upvotes: 1