Reputation: 1
I have a MATLAB script that calls C code (via MEX) which reads my .dat file containing all my data, one field of which is timestamp_low. Running my script on Windows I get the correct time value of 4.1472*10^9, but on Linux/Mac I get the value 1.8447*10^19. Essentially I'm only reading from the file and saving it.
unsigned int timestamp_low = inBytes[0] + (inBytes[1] << 8) +
(inBytes[2] << 16) + (inBytes[3] << 24);
mxSetField(outCell, 0, "timestamp_low", mxCreateDoubleScalar((double)timestamp_low));
Does anyone know if the MEX compiler works differently on different operating systems for this kind of thing? I haven't written this code myself, so I'm not super familiar with the details. I use it to gather CSI from a WiFi device. I've tried different MATLAB versions on Mac/Linux and they produce the same (wrong) value.
Upvotes: 0
Views: 65
Reputation: 212969
I suspect you have some Undefined Behaviour here:
unsigned int timestamp_low = inBytes[0] + (inBytes[1] << 8) +
(inBytes[2] << 16) + (inBytes[3] << 24);
(although it's not clear, as you haven't told us the type of inBytes).
Try:
unsigned int timestamp_low = (unsigned int)inBytes[0] + ((unsigned int)inBytes[1] << 8) +
((unsigned int)inBytes[2] << 16) + ((unsigned int)inBytes[3] << 24);
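For what it's worth, here is a minimal standalone sketch of why the casts matter, assuming inBytes is a plain char buffer (signed by default on x86 Linux and macOS) and using made-up byte values. The extra (unsigned char) cast also guards against the byte itself reading back as negative:

#include <stdio.h>

int main(void)
{
    /* Hypothetical little-endian bytes; with plain (signed) char, 0xF7 reads back as -9. */
    char inBytes[4] = { 0x00, 0x40, 0x30, (char)0xF7 };

    /* The promotion that breaks the original expression: inBytes[3] becomes a
       negative int, so feeding it into << 24 is undefined behaviour and the
       sum typically comes out negative. */
    printf("inBytes[3] promotes to %d\n", inBytes[3]);   /* -9 on x86 Linux/macOS */

    /* Casting each byte to an unsigned type before shifting keeps the whole
       computation in well-defined unsigned 32-bit arithmetic. */
    unsigned int timestamp_low =
        (unsigned int)(unsigned char)inBytes[0] +
        ((unsigned int)(unsigned char)inBytes[1] << 8) +
        ((unsigned int)(unsigned char)inBytes[2] << 16) +
        ((unsigned int)(unsigned char)inBytes[3] << 24);

    printf("timestamp_low = %u\n", timestamp_low);       /* 4147134464, on the same order as the expected 4.1472*10^9 */
    return 0;
}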
Upvotes: 1