Reputation: 374
For a project I am working on, I use magic numbers. This macro is used to define one:
#define BOOTSIGNATURE 0xAA55
However, when I hexdump the resulting file, where it should say AA 55 it says 55 AA.
Is GCC mixing up the endianness, or am I? This project is for the x86 processor, and AA 55 needs to be in that specific order. I could just swap the bytes, but I am curious why GCC does this.
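A minimal sketch of the kind of write that shows it (not my exact code, but the effect is the same):

#include <stdint.h>
#include <stdio.h>

#define BOOTSIGNATURE 0xAA55

int main(void)
{
    uint16_t sig = BOOTSIGNATURE;
    FILE *f = fopen("boot.bin", "wb");
    if (!f)
        return 1;
    /* fwrite() copies the bytes of sig as they sit in memory, i.e. in the
     * CPU's native order; on x86 (little-endian) that is 55 AA. */
    fwrite(&sig, sizeof sig, 1, f);
    fclose(f);
    return 0;
}

Running hexdump -C boot.bin on the result shows 55 aa.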
Upvotes: 0
Views: 404
Reputation: 11694
If you want to write portable code, use functions that force a given byte order. For example, these functions convert a 16-bit value from native host byte order to either big-endian or little-endian, depending on which order you need in your file.
#include <endian.h>   /* htobe16() on glibc/Linux */
#include <stdint.h>

#define BOOTSIGNATURE 0xAA55

struct bootheader {
    uint16_t signature_be;
} header;

header.signature_be = htobe16(BOOTSIGNATURE);
I like using a _le or _be suffix on variables and structure elements with non-host byte order.
Since you need big-endian, you can use htons() from arpa/inet.h, but I'm not a big fan of that method. I don't think the name is as clear as htobe16, and you don't have functions for converting to/from little-endian byte order.
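For comparison, a minimal sketch of the htons() variant (htons() is declared in arpa/inet.h on POSIX systems; writing to stdout is just for illustration):

#include <arpa/inet.h>   /* htons(): host to network byte order, 16 bits */
#include <stdint.h>
#include <stdio.h>

#define BOOTSIGNATURE 0xAA55

int main(void)
{
    /* Network byte order is big-endian, so htons() produces the same
     * bytes as htobe16() here; the name is just less descriptive. */
    uint16_t signature_be = htons(BOOTSIGNATURE);
    fwrite(&signature_be, sizeof signature_be, 1, stdout);
    return 0;
}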
Upvotes: 0
Reputation: 374
Simplest solution: use the integer form of the hex value, so the corresponding binary will result in the same magic number! In this case that would be 43605.
Upvotes: -2
Reputation: 400462
Preprocessor macros don't show up in the compiled object files -- they're not seen at all by the compiler. If you just had that #define and never used it anywhere, there'd be no trace of it.
If you used it in code somewhere, it would likely show up as a constant in an instruction (e.g. to load a constant into a register or memory). If you used it to initialize static data, it would show up as a constant in the data segment:
#include <stdint.h>

#define BOOTSIGNATURE 0xAA55

// Global variable definition
uint16_t my_global = BOOTSIGNATURE;
If you compile the above and look at the data segment, it looks like this:
$ gcc -c test.c
$ objdump -s test.o
[...]
Contents of section .data:
0000 55aa0000 U...
As you can see, the two bytes are stored in memory in little-endian order, 55 AA (the leading 0000 is the segment offset in hex).
If you want to control the endianness of the data, then store it as an explicit byte array:
uint8_t my_global[] = {0xAA, 0x55};
This will always store the bytes in the order specified.
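For example, writing the array straight to a file (a sketch; the file name is arbitrary):

#include <stdint.h>
#include <stdio.h>

/* The array's bytes are laid out in declaration order,
 * regardless of the host CPU's endianness. */
uint8_t my_global[] = {0xAA, 0x55};

int main(void)
{
    FILE *f = fopen("sig.bin", "wb");
    if (!f)
        return 1;
    fwrite(my_global, sizeof my_global, 1, f);
    fclose(f);
    return 0;
}

A hexdump of sig.bin then shows aa 55 on any architecture.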
Upvotes: 2
Reputation: 613302
0xAA55 is an int and so you are subject to the endianness of your machine. I would store this as a char array:
const unsigned char BOOTSIGNATURE[] = {0xAA, 0x55};
Upvotes: 2