Reputation: 35
For UTF-16 input, we can read the file and convert it to wchar_t at the same time. For example:
std::wifstream* file = new std::wifstream(name, std::ifstream::binary);
std::locale lo(file->getloc(), new std::codecvt_utf16<wchar_t, 0x10ffff, std::little_endian>);
file->imbue(lo);
How could I do the same for UTF-32 input?
Upvotes: 0
Views: 905
Reputation: 42924
You may want to use the classic C++ pattern of allocating the wifstream on the stack instead of the heap (new):
// Instead of allocating on the heap:
std::wifstream* file = new std::wifstream(name, std::ifstream::binary);
// allocate on the stack:
std::wifstream file(name, std::ifstream::binary);
For the codecvt part, I'd try with std::codecvt_utf16<char32_t>.
P.S. Note that wchar_t can have different sizes (16 bits, 32 bits) on different platforms, so it may be better for you to use std::u16string for UTF-16 and std::u32string for UTF-32.
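As a rough illustration of combining those two suggestions, here is a minimal sketch (the file name utf16le.txt and the assumption of little-endian UTF-16 input without a BOM are placeholders): it reads the raw bytes with a plain std::ifstream and lets std::wstring_convert with std::codecvt_utf16<char32_t> produce a std::u32string. Both wstring_convert and codecvt_utf16 still work but have been deprecated since C++17.

#include <codecvt>   // std::codecvt_utf16 (deprecated in C++17, still available)
#include <fstream>
#include <iterator>
#include <locale>    // std::wstring_convert
#include <string>

int main()
{
    // Read the raw bytes of the (assumed) UTF-16LE file without a BOM.
    std::ifstream file("utf16le.txt", std::ios::binary);
    std::string bytes((std::istreambuf_iterator<char>(file)),
                      std::istreambuf_iterator<char>());

    // Convert the UTF-16LE byte sequence into char32_t code points.
    std::wstring_convert<
        std::codecvt_utf16<char32_t, 0x10ffff, std::little_endian>,
        char32_t> conv;
    std::u32string text = conv.from_bytes(bytes);

    return text.empty() ? 1 : 0;
}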
Upvotes: 2