Reputation: 52411
I have written a small application which works at some point with binary data. In unit tests, I compare this data with the expected one. When an error occurs, I want the test to display the hexadecimal output such as:
Failure
Expected: string_to_hex(expected, 11)
Which is: "01 43 02 01 00 65 6E 74 FA 3E 17"
To be equal to: string_to_hex(writeBuffer, 11)
Which is: "01 43 02 01 00 00 00 00 98 37 DB"
In order to display that (and to compare binary data in the first place), I used the code from Stack Overflow, slightly modifying it for my needs:
std::string string_to_hex(const std::string& input, size_t len)
{
    static const char* const lut = "0123456789ABCDEF";
    std::string output;
    output.reserve(2 * len);
    for (size_t i = 0; i < len; ++i)
    {
        const unsigned char c = input[i];
        output.push_back(lut[c >> 4]);
        output.push_back(lut[c & 15]);
    }
    return output;
}
When checking for memory leaks with valgrind, I found a lot of errors such as this one:
Use of uninitialised value of size 8
at 0x11E75A: string_to_hex(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, unsigned long)
I'm not sure I understand it. First, everything seems initialized, including, unless I'm mistaken, output. Moreover, there is no mention of size 8 anywhere in the code; the value of len varies from test to test, while valgrind reports the same size 8 every time.
How should I fix this error?
Upvotes: 0
Views: 180
Reputation: 37600
So this is one of the cases where passing a pointer to char that points to a buffer filled with arbitrary binary data into the evil implicit constructor of the std::string class causes the string to be truncated at the first \0. A straightforward fix would be to pass a raw pointer instead, but a better solution is to start using array_view, span, or a similar utility class that provides index validation, at least in debug builds, for both input and lut.
Upvotes: 1