Reputation: 191
I have a simple vector of ints, and I want to write it into a binary file. For example:
#include <algorithm>
#include <cstdint>
#include <fstream>
#include <iterator>
#include <vector>

int main() {
    std::vector<uint32_t> myVector{5, 10, 15, 20};
    // write vector to bin file
    std::ofstream outfile("./binary_ints.data", std::ios_base::binary | std::ios::trunc);
    std::copy(myVector.begin(), myVector.end(), std::ostreambuf_iterator<char>(outfile));
    outfile.close();
    return 0;
}
Then, if I inspect the file "binary_ints.data" in hex mode, I see this:
00000000: 050a 0f14 0a
That's OK!
However, if myVector has this data:
std::vector<uint32_t> myVector{3231748228};
Then, the hex stored is weird:
00000000: 840a
A single byte 0x84 doesn't match the int value 3231748228.
What's happening here?
Thanks.
Upvotes: 0
Views: 152
Reputation: 840
The problem is that each value in your std::vector<uint32_t> is converted to char during the std::copy() invocation. 3231748228 is represented in hex as 0xC0A09084. std::copy() takes each uint32_t value and narrows it to a single byte, keeping only the least significant byte, which is 0x84. (This is a value conversion, not a memory reinterpretation, so it happens regardless of the processor's endianness.) The trailing 0x0a byte in your dump is a newline character; it is not written by the program, and is most likely appended by the tool you used to inspect the file in hex mode.
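As a quick check, here is a minimal sketch (my own illustration, not part of your code) of the narrowing conversion that std::copy() performs on each element:
#include <cstdint>
#include <cstdio>

int main() {
    uint32_t value = 3231748228u;  // 0xC0A09084
    // narrowing keeps only the low byte, modulo 256
    unsigned char narrowed = static_cast<unsigned char>(value);
    // prints "84": the same single byte that ended up in your file
    std::printf("%02x\n", narrowed);
    return 0;
}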
A possible solution is to use ofstream::write() instead of std::copy():
#include <cstdint>
#include <fstream>
#include <vector>

int main() {
    std::vector<uint32_t> myVector{3231748228};
    // write vector to bin file
    std::ofstream outfile("./binary_ints.data", std::ios_base::binary | std::ios::trunc);
    outfile.write(
        reinterpret_cast<const char*>(myVector.data()),
        myVector.size() * sizeof(decltype(myVector)::value_type));
    outfile.close();
    return 0;
}
Note the use of decltype(). The same effect may be achieved by just writing sizeof(uint32_t), but with decltype() you can be sure the code remains correct even if you change myVector's value type.
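For completeness, here is a minimal sketch (my own addition, assuming the file was produced by the code above on the same machine, so element size and byte order match) of reading the data back with ifstream::read():
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <vector>

int main() {
    // open at the end to find the file size, then rewind
    std::ifstream infile("./binary_ints.data", std::ios_base::binary | std::ios_base::ate);
    std::streamsize bytes = infile.tellg();
    infile.seekg(0);

    // read the raw bytes back into a vector of uint32_t
    std::vector<uint32_t> readBack(bytes / sizeof(uint32_t));
    infile.read(reinterpret_cast<char*>(readBack.data()), bytes);

    for (uint32_t v : readBack)
        std::printf("%u\n", v);  // prints 3231748228
    return 0;
}
On a little-endian machine, a hex dump of the file itself would show the bytes 84 90 a0 c0, i.e. the value 0xC0A09084 stored least significant byte first.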
Upvotes: 3