Reputation: 371
I would like to read UTF-8 text from a .dll string table, something like this:
LPWSTR nnW;
LoadStringW(hMod, id, nnW, MAX_PATH);
and after that convert the LPWSTR nnW into a std::wstring nnWstring.
I tried it this way:
LPWSTR nnW;
LoadStringW(hMod, id, nnW, MAX_PATH);
const int length = MultiByteToWideChar(CP_UTF8,
0, // no flags required
(LPCSTR)nnW,
-1, // automatically determine length
NULL,
0);
std::wstring nnWstring(length, L'\0');
if (!MultiByteToWideChar(CP_UTF8,
0,
(LPCSTR)nnW,
-1,
&nnWstring[0],
length))
MessageBoxW(NULL, (LPCWSTR)nnWstring.c_str(), L"wstring", MB_OK | MB_ICONERROR);
After that, the MessageBoxW only shows the first letter.
Upvotes: 3
Views: 5777
Reputation: 48019
I would like to read UTF-8 text from a .dll string table, something like this
Generally, string tables in Windows are UTF-16. You're trying to put UTF-8 data into one. The UTF-8 data is being treated like "extended" ASCII, so each byte is being expanded to two bytes with zero bytes between them.
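To see what that expansion does, here is a small sketch (the example character "é" is my own choice, picked because its UTF-8 encoding is the two bytes 0xC3 0xA9):
#include <string>
// Widening each byte of a UTF-8 sequence individually, the way the string table
// treats it, turns the two bytes 0xC3 0xA9 ("é") into the two characters "Ã©".
std::wstring WidenBytes(const std::string& bytes)
{
    std::wstring out;
    for (unsigned char b : bytes)
        out.push_back(static_cast<wchar_t>(b));   // zero-extend: 0xC3 -> L'\x00C3'
    return out;
}
// WidenBytes("\xC3\xA9") == L"\x00C3\x00A9" ("Ã©"), not L"\x00E9" ("é")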
You should probably put UTF-16 data in the string table directly.
If you must store UTF-8 data in the resources, you can put it into an RCDATA resource and use the lower-level resource functions to get the data out. Then you'll have to convert from UTF-8 to UTF-16 to store it in a wstring.
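A minimal sketch of that approach, assuming the UTF-8 text was compiled in as an RCDATA resource (idrUtf8Text is a placeholder for whatever resource ID you used; the resource and conversion calls are the standard Win32 ones):
#include <windows.h>
#include <string>
// Load UTF-8 text stored as an RCDATA resource and convert it to UTF-16.
std::wstring LoadUtf8Rcdata(HMODULE hMod, int idrUtf8Text)
{
    HRSRC hRes = FindResourceW(hMod, MAKEINTRESOURCEW(idrUtf8Text), (LPCWSTR)RT_RCDATA);
    if (!hRes) return std::wstring();

    HGLOBAL hData = LoadResource(hMod, hRes);
    if (!hData) return std::wstring();

    const char* utf8 = static_cast<const char*>(LockResource(hData));
    const int utf8Len = static_cast<int>(SizeofResource(hMod, hRes));
    if (!utf8 || utf8Len <= 0) return std::wstring();

    // First call asks how many UTF-16 code units are needed (RCDATA is not
    // NUL-terminated, so the explicit byte count is passed instead of -1).
    const int wideLen = MultiByteToWideChar(CP_UTF8, 0, utf8, utf8Len, NULL, 0);
    std::wstring result(wideLen, L'\0');

    // Second call performs the conversion into the wstring's buffer.
    MultiByteToWideChar(CP_UTF8, 0, utf8, utf8Len, &result[0], wideLen);
    return result;
}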
Upvotes: 1
Reputation: 283803
No conversion or copying needed.
std::wstring nnWString(MAX_PATH, 0);
// LoadStringW returns the number of characters copied, so resize() trims the buffer to fit.
nnWString.resize(LoadStringW(hMod, id, &nnWString[0], static_cast<int>(nnWString.size())));
Note: Your original code causes undefined behavior, because it writes using an uninitialized pointer. Surely not what you wanted.
Here's another variation:
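A minimal sketch of one such variation, relying on LoadStringW's documented behavior that passing 0 for the buffer size hands back a read-only pointer to the string resource itself, so no intermediate buffer is needed:
// With a buffer size of 0, LoadStringW returns a read-only pointer to the
// string resource; the return value is the string's length in characters.
LPCWSTR psz = nullptr;
const int cch = LoadStringW(hMod, id, reinterpret_cast<LPWSTR>(&psz), 0);
std::wstring nnWString = (cch > 0) ? std::wstring(psz, cch)   // copy exactly cch characters
                                   : std::wstring();          // lookup failed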
Upvotes: 6