Reputation: 42083
Lately I've been asked to write a function that reads the binary file into the std::vector<BYTE>
where BYTE
is an unsigned char
. Quite quickly I came up with something like this:
#include <fstream>
#include <vector>
typedef unsigned char BYTE;
std::vector<BYTE> readFile(const char* filename)
{
// open the file:
std::streampos fileSize;
std::ifstream file(filename, std::ios::binary);
// get its size:
file.seekg(0, std::ios::end);
fileSize = file.tellg();
file.seekg(0, std::ios::beg);
// read the data:
std::vector<BYTE> fileData(fileSize);
file.read((char*) &fileData[0], fileSize);
return fileData;
}
which seems to be unnecessarily complicated and the explicit cast to char*
that I was forced to use while calling file.read
doesn't make me feel any better about it.
Another option is to use std::istreambuf_iterator
:
std::vector<BYTE> readFile(const char* filename)
{
// open the file:
std::ifstream file(filename, std::ios::binary);
// read the data:
return std::vector<BYTE>((std::istreambuf_iterator<char>(file)),
std::istreambuf_iterator<char>());
}
which is pretty simple and short, but still I have to use the std::istreambuf_iterator<char>
even when I'm reading into std::vector<unsigned char>
.
The last option that seems to be perfectly straightforward is to use std::basic_ifstream<BYTE>
, which expresses explicitly that "I want an input file stream and I want to use it to read BYTE
s":
std::vector<BYTE> readFile(const char* filename)
{
// open the file:
std::basic_ifstream<BYTE> file(filename, std::ios::binary);
// read the data:
return std::vector<BYTE>((std::istreambuf_iterator<BYTE>(file)),
std::istreambuf_iterator<BYTE>());
}
but I'm not sure whether basic_ifstream
is an appropriate choice in this case.
What is the best way of reading a binary file into the vector
? I'd also like to know what's happening "behind the scenes" and what possible problems I might encounter (apart from the stream not being opened properly, which can be avoided by a simple is_open
check).
Is there any good reason why one would prefer to use std::istreambuf_iterator
here?
(the only advantage that I can see is simplicity)
Upvotes: 110
Views: 124318
Reputation: 448
This works for me (sample shown without error checking):
std::vector<unsigned char> bytes;
FILE* fp = nullptr;
fopen_s(&fp, sFilePath, "rb");
auto size = size_t(std::filesystem::file_size(sFilePath));
bytes.resize(size, 0);
auto bytesRead = fread(bytes.data(), sizeof(unsigned char), size, fp);
fclose(fp);
Upvotes: 0
Reputation: 1766
The class below extends vector with a binary file load and save. I've returned to this question multiple times already, so this is the code for my next return - and for all others who will be looking for the binary file save method next. :)
#include <cstdint>
#include <fstream>
#include <iterator>
#include <vector>
// The class offers entire file content read/write in single operation
class BinaryFileVector : public std::vector<uint8_t>
{
public:
using std::vector<uint8_t>::vector;
bool loadFromFile(const char *fileName) noexcept
{
// Try to open a file specified by its name
std::ifstream file(fileName, std::ios::in | std::ios::binary);
if (!file.is_open() || file.bad())
return false;
// Clear whitespace removal flag
file.unsetf(std::ios::skipws);
// Determine size of the file
file.seekg(0, std::ios_base::end);
size_t fileSize = file.tellg();
file.seekg(0, std::ios_base::beg);
// Discard previous vector content and release its memory
clear();
shrink_to_fit();
// Preallocate memory to avoid unnecessary reallocations due to vector growth
reserve(fileSize);
// Read entire file content into preallocated vector memory
insert(begin(),
std::istream_iterator<uint8_t>(file),
std::istream_iterator<uint8_t>());
// Make sure entire content is loaded
return size() == fileSize;
}
bool saveToFile(const char *fileName) const noexcept
{
// Write entire vector content into a file specified by its name
std::ofstream file(fileName, std::ios::out | std::ios::binary);
try {
// Streams do not throw by default; enable exceptions so the catch below can fire
file.exceptions(std::ios::failbit | std::ios::badbit);
file.write((const char *) data(), size());
}
catch (...) {
return false;
}
// Determine number of bytes successfully stored in file
size_t fileSize = file.tellp();
return size() == fileSize;
}
};
Usage example
#include <iostream>
int main()
{
BinaryFileVector binaryFileVector;
if (!binaryFileVector.loadFromFile("data.bin")) {
std::cout << "Failed to read a file." << std::endl;
return 0;
}
if (!binaryFileVector.saveToFile("copy.bin")) {
std::cout << "Failed to write a file." << std::endl;
return 0;
}
std::cout << "Success." << std::endl;
return 0;
}
Upvotes: 0
Reputation: 52201
#include <cstdint>
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>
int main()
{
    std::ifstream stream("mona-lisa.raw", std::ios::in | std::ios::binary);
    std::vector<uint8_t> contents((std::istreambuf_iterator<char>(stream)), std::istreambuf_iterator<char>());
    for (auto i : contents) {
        int value = i;
        std::cout << "data: " << value << std::endl;
    }
    std::cout << "file size: " << contents.size() << std::endl;
}
Upvotes: 32
Reputation: 102246
When testing for performance, I would include a test case for:
std::vector<BYTE> readFile(const char* filename)
{
// open the file:
std::ifstream file(filename, std::ios::binary);
// Stop eating new lines in binary mode!!!
file.unsetf(std::ios::skipws);
// get its size:
std::streampos fileSize;
file.seekg(0, std::ios::end);
fileSize = file.tellg();
file.seekg(0, std::ios::beg);
// reserve capacity
std::vector<BYTE> vec;
vec.reserve(fileSize);
// read the data:
vec.insert(vec.begin(),
std::istream_iterator<BYTE>(file),
std::istream_iterator<BYTE>());
return vec;
}
My thinking is that the constructor of Method 1 touches the elements in the vector
, and then the read
touches each element again.
Method 2 and Method 3 look most promising, but could suffer one or more resize
's. Hence the reason to reserve
before reading or inserting.
I would also test with std::copy
:
...
std::vector<BYTE> vec;
vec.reserve(fileSize);
std::copy(std::istream_iterator<BYTE>(file),
std::istream_iterator<BYTE>(),
std::back_inserter(vec));
In the end, I think the best solution will avoid operator >>
from istream_iterator
(and all the overhead and goodness from operator >>
trying to interpret binary data). But I don't know what to use that allows you to directly copy the data into the vector.
Finally, my testing with binary data is showing ios::binary
is not being honored. Hence the reason for noskipws
(declared in <ios>
).
Upvotes: 66
Reputation: 136256
Since you are loading the entire file into memory, the optimal approach is to map the file into memory. The kernel loads the file into its page cache anyway, and mapping the file simply exposes those cached pages to your process. This is also known as zero-copy.
When you use std::vector<>
it copies the data from the kernel page cache into std::vector<>
which is unnecessary when you just want to read the file.
Also, when you pass two input iterators to std::vector<>
, it grows its buffer while reading because it does not know the file size. And when you resize std::vector<>
to the file size first, it needlessly zeroes out its contents, because they are about to be overwritten with file data anyway. Both methods are sub-optimal in terms of space and time.
Upvotes: 8
Reputation: 129374
I would have thought that the first method, using the size and stream::read()
, would be the most efficient. The "cost" of casting to char *
is most likely zero: casts of this kind simply tell the compiler "Hey, I know you think this is a different type, but I really want this type here...", and do not add any extra instructions. If you wish to confirm this, try reading the file into a char array and compare the actual assembler code. Aside from a little extra work to figure out the address of the buffer inside the vector, there shouldn't be any difference.
As always, the only way to tell for sure IN YOUR CASE what is the most efficient is to measure it. "Asking on the internet" is not proof.
Upvotes: 3