Reputation: 1508
I'm not used to C++, so bear with me...
Two bytes are read from a device into a buffer and are then to be printed.
The code below is supposed to produce the string "0x204D". However, it produces "0x M", which in hex is 30 78 20 4D.
So the byte values are not being rendered as hexadecimal digits.
void vito_unit::decodeAsRaw(unsigned char *buffer, int bufferLen)
{
    std::stringstream *decodedClearText;
    decodedClearText = new std::stringstream;
    *decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        *decodedClearText << buffer[i];
    }
    setValue(decodedClearText->str());
}
How should it be done?
Upvotes: 3
Views: 5817
Reputation: 385144
This has nothing to do with std::hex.
When you stream a [signed/unsigned] char, its ASCII representation is used, because that is usually what is expected of chars.
You can stream a number instead by converting it to int. Then the feature that renders numbers in hexadecimal notation (i.e. std::hex) will be triggered.
You should also fix that memory leak and unnecessary dynamic allocation:
void vito_unit::decodeAsRaw(unsigned char const* const buffer, int const bufferLen)
{
    std::stringstream decodedClearText;
    decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        decodedClearText << +buffer[i];
    }
    setValue(decodedClearText.str());
}
The unary "+" performs an integral promotion to int.
Upvotes: 5
Reputation: 1508
The hint from Bo Persson was what I needed.
for (int i = 0; i < bufferLen; i++) {
    *decodedClearText << (int)buffer[i];
}
did the trick.
Upvotes: 3
Reputation: 126
buffer[i] is of type unsigned char and is thus printed as a character instead of its hexadecimal representation. You can cast the value to an unsigned int to avoid that.
void vito_unit::decodeAsRaw(unsigned char *buffer, int bufferLen)
{
    std::stringstream *decodedClearText;
    decodedClearText = new std::stringstream;
    *decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        *decodedClearText << (unsigned int) buffer[i];
    }
    setValue(decodedClearText->str());
}
Upvotes: 4