Reputation: 105
I have a function in my C++ application that needs an integer as an input. Sadly, this integer is only available in the form of an unsigned char array, which inclines me to do this:
unsigned char c[4] = {'1','2','3','4'};
void myFuncThatBadlyNeedsInts(int i)
//compares some memory value (which is an int) with another one...
myFuncThatBadlyNeedsInts((int)c);
This gives me an error, which tells me that this is not allowed. But if I decide to get tricky and do this:
myFuncThatBadlyNeedsInts(*((int*)&c));
Now the program goes about and always gives me the result I want. My question is: why is there a difference in the result of the two casts? Shouldn't they both do the same thing, with the difference that I have two unnecessary pointers in the process?
Help, or guidance to an already existing answer to my question, is much appreciated.
EDIT (since I can't comment): The need for this indeed silly conversion is inherited from a project which compares a specific memory location (as an int) with a DWORD which is retrieved from an FPGA and comes as an array. The DWORD gets read in the end as one hex number. I'll try to get permission to change this, and THANK YOU ALL for the quick responses. I really didn't get this part of the program, nor did I understand why it worked like this in the first place. Now I know someone got lucky.
P.S.: Since I'm new here and this is my first question, please let me know what other specifics you might need, or just edit my newbie bad habits away.
Upvotes: 0
Views: 106
Reputation: 409166
When you do myFuncThatBadlyNeedsInts((int)c), the compiler first decays the array c to a pointer to its first element, i.e. &c[0]; you then cast this pointer to an int and pass that to the function.
When you do *((int*)&c), you take the address of the array (which has type unsigned char (*)[4]), tell the compiler that it is a pointer to an int (which is not correct), and then dereference that (incorrect) int*.
So both calls are actually incorrect. The casting just silences the compiler.
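For illustration, here is a minimal sketch (my own addition, not part of the original question) that prints what each expression actually works with; it commits the same aliasing violation described above, so treat it purely as a demonstration:
#include <cstdio>

int main()
{
    unsigned char c[4] = {'1', '2', '3', '4'};

    // What (int)c would pass along: the address the array decays to.
    // Printed with %p here, because converting a pointer to an int is
    // ill-formed where int is narrower than a pointer - which is why
    // the original call produced an error.
    std::printf("address of c      : %p\n", static_cast<void*>(c));

    // What *((int*)&c) does: reinterpret the four bytes at that address
    // as an int. This breaks strict aliasing, and the value depends on
    // the machine's endianness.
    int reinterpreted = *reinterpret_cast<int*>(&c);
    std::printf("bytes read as int : 0x%08x\n", static_cast<unsigned>(reinterpreted));
}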
If you want to treat the four bytes of the array as a single 32-bit word, there are ways to do it, but they all break the strict aliasing rule.
The simplest way is very close to what you have now, and is done with casting: using a C-style cast, you cast the pointer that c decays to into a pointer to int and dereference that:
myFuncThatBadlyNeedsInts(*(int*)c);
Note that this is not the same thing as either of your attempts.
The second way is to use a union:
union my_union
{
char bytes[sizeof(int)];
int integer;
};
Then copy the data into the union's bytes member, and read out the integer member.
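A minimal sketch of that usage could look like this (assuming sizeof(int) == 4 as in the question; the value you read back still depends on endianness, and reading the non-active member of a union is a C idiom that C++ compilers commonly, though not formally, allow):
#include <cstdio>
#include <cstring>

union my_union
{
    char bytes[sizeof(int)];
    int  integer;
};

int main()
{
    static_assert(sizeof(int) == 4, "this sketch assumes a 4-byte int");

    unsigned char c[4] = {'1', '2', '3', '4'};

    my_union u;
    std::memcpy(u.bytes, c, sizeof u.bytes);                   // copy the raw bytes in
    std::printf("0x%08x\n", static_cast<unsigned>(u.integer)); // read them back out as an int
}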
Upvotes: 1
Reputation: 7476
In the first case you are trying to cast a char array to an int - this is obviously meaningless, in that a list of characters is quite different from an int.
In the second case you first take the address of the array - strictly speaking, &c has type unsigned char (*)[4] (a pointer to the whole array, not unsigned char *), but it points at the same address as the first element. It is legal (although dangerous) to cast between pointer types, thus the cast from that pointer to int * is legal.
Then you dereference the pointer and get the integer that happens to be at this spot, which is probably some nasty (meaningless) number derived from the bytes of the first few characters in the array.
So your second solution doesn't convert the char[] into the int you presumably want; instead it gives you an integer reinterpretation of the first bytes of the char array.
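To make that concrete, here is a small sketch (assuming ASCII and a little-endian machine, which is what the second cast happens to rely on) showing which number those four bytes turn into:
#include <cstdio>

int main()
{
    unsigned char c[4] = {'1', '2', '3', '4'}; // ASCII bytes 0x31 0x32 0x33 0x34

    // On a little-endian machine the first byte becomes the least
    // significant byte of the resulting int:
    int value = c[0] | (c[1] << 8) | (c[2] << 16) | (c[3] << 24);

    std::printf("%d (0x%08x)\n", value, static_cast<unsigned>(value)); // 875770417 (0x34333231)
}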
Upvotes: 0
Reputation: 1384
In the second case you take a pointer to the unsigned char and then cast it to a pointer to int, so in fact you read your uchar plus the 3 bytes just after it (in this case the whole array c). That is because sizeof(int) is usually (but not always) 4, while sizeof(unsigned char) is only 1. So don't do this unless you like to shoot yourself in the leg.
To be honest, I don't really understand what you are trying to achieve in this example.
Upvotes: 0