Alex44

Reputation: 3855

uint8_t Array - Data inside memory

I have a question about a behavior I noticed with gdb.

First I compiled this small program with gcc on a 64-bit machine:

#include <stdio.h>
#include <inttypes.h>

void fun (uint8_t *ar)
{
   uint8_t i;

   for(i = 0; i<4; i++)
   {
      printf("%i\n",*(&ar[0]+i));
   }
}

int main (void)
{
   uint8_t ar[4];

   ar[0] = 0b11001100;
   ar[1] = 0b10101010;
   ar[2] = 0b01010110;
   ar[3] = 0b00110011;

   fun(ar);

   return 0;
}

Then I looked at the memory of ar with gdb:

(gdb) p/t ar
$7 = {11001100, 10101010, 1010110, 110011}
(gdb) x ar
0x7fffffffe360: 00110011010101101010101011001100
(gdb) x 0x7fffffffe360
0x7fffffffe360: 00110011010101101010101011001100
(gdb) x 0x7fffffffe361
0x7fffffffe361: 11111111001100110101011010101010
(gdb) x 0x7fffffffe362
0x7fffffffe362: 01111111111111110011001101010110
(gdb) x 0x7fffffffe363
0x7fffffffe363: 00000000011111111111111100110011

I saw that the uint8_t array elements were collected together into a 32-bit field. For each following address, the field just shifts to the right:

&ar[0] -> {ar[3],ar[2],ar[1],ar[0]}
&ar[1] -> {xxxx,ar[3],ar[2],ar[1]}
&ar[2] -> {xxxx,xxxx,ar[3],ar[2]}
&ar[3] -> {xxxx,xxxx,xxxx,ar[3]}

It seems a bit strange to me, and I want to know: why does this happen, and can I rely on this behavior? Is this specific to gcc, or is it standard behavior?

Upvotes: 0

Views: 1492

Answers (2)

nos

Reputation: 229098

In gdb, x just prints out whatever is in the memory location, regardless of its type in the C code. You're just getting the defaults (or previously used settings) for the width (4 bytes in your case) and format. Do e.g. x/b ar to print the location as bytes, and do help x for more info.

If you print it as anything other than a byte, though, the endianness of your processor will determine how the memory is interpreted.

Use p to take the type into account, as in p ar
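A short sketch of the commands described above, applied to the question's session (the /NFU count, format, and unit letters are standard gdb syntax; the byte values are taken from the question's dump):

```
(gdb) x/4xb ar      # examine 4 units of 1 byte each, in hex: 0xcc 0xaa 0x56 0x33
(gdb) x/4tb ar      # the same 4 bytes, in binary
(gdb) p ar          # print ar using its declared C type, uint8_t[4]
(gdb) help x        # lists the unit sizes (b,h,w,g) and formats (x,t,d,...)
```

With x/4xb the unit is explicitly one byte, so no neighboring stack bytes get pulled into the display.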

Upvotes: 1

glglgl

Reputation: 91017

It has to do with endianness:

On x64, as on every other little-endian machine, a value such as 0x12345678 is put into memory in the form 78 56 34 12, i. e. with the least significant byte first.

The debugger knows that and shows it to you in this way.

Expressed in hex, which makes your data easier to read, it looks like this:

Your memory is filled with

CC AA 56 33 FF 7F 00

which makes

  • the value at offset 0 is 0x3356AACC
  • the value at offset 1 is 0xFF3356AA
  • the value at offset 2 is 0x7FFF3356
  • the value at offset 3 is 0x007FFF33

Upvotes: 1
