M2tM

Reputation: 4505

C++20 with u8, char8_t and std::string

C++11 brought us the u8 prefix for UTF-8 literals. I thought that was pretty cool a few years ago and peppered my code with things like this:

std::string myString = u8"●";

This was all fine and good, but in C++20 it no longer seems to compile, because u8 now creates a const char8_t*, which is incompatible with std::string, which just uses char.

Should I be creating a new utf8string? What's the consistent and correct way to do this kind of thing in a C++20 world where we have more explicit types that don't really match with the standard std::string?

Upvotes: 59

Views: 24897

Answers (6)

Martini Bianco

Reputation: 1815

Another way to use u8 literals as const char* would be a user-defined literal (see https://en.cppreference.com/w/cpp/language/user_literal):

std::string operator"" S(const char8_t* str, std::size_t) {
    return reinterpret_cast< const char* >(str);
}
char const* operator"" C(const char8_t* str, std::size_t) {
    return reinterpret_cast< const char* >(str);
}

Usage: the literals can then be used like this:

std::string myString = u8"●"S;


SetConsoleOutputCP(CP_UTF8);  // Windows-only: switch the console to UTF-8
std::cout << u8"Привет, мир!"_C << std::endl;

Explanation

The code above defines two user-defined literals u8"…"_S and u8"…"_C (remember: the literal u8"…" in C++20 is of type const char8_t*). The _S literal creates a std::string and the _C literal creates a const char*.

That means all literals of the form u8"…"_C can be used like "…" literals, while all literals of the form u8"…"_S can be used like "…"s literals.

PS: Literal suffixes that do not start with an underscore "_" are reserved for the standard library, which is why the names above use _S and _C. Visual Studio accepted suffixes without the underscore when I tried it, but other compilers warn that such names are reserved, and all examples on cppreference use an underscore.

Upvotes: 5

Jack Heeley

Reputation: 51

It currently looks like utf8 everywhere advocates have been thrown under the bus, with C++20 offering yet another flawed, incomplete option to consider when deciding how to deal with character encoding in portable code. char8_t further muddies some already very dirty water. The best I've been able to come up with as a stopgap, using the MSVC option "Preview - Features from the Latest C++ Working Draft (/std:c++latest)", is this...

#if defined(__cpp_char8_t)
template<typename T>
const char* u8Cpp20(T&& t) noexcept
{
#pragma warning (push)
#pragma warning (disable: 26490)  // C26490: don't use reinterpret_cast (Core Guidelines check)
    return reinterpret_cast<const char*>(t);
#pragma warning (pop)
}
#define U8(x) u8Cpp20(u8##x)
#else
#define U8(x) u8##x
#endif

It is ugly, inefficient and annoying, but it allows replacing all u8"…" literals with U8("…") in legacy 'utf8 everywhere' code. I plan to shun char8_t until the offering is more coherent and complete (or forever). We should wait and see what C++20 finally settles on. At the moment char8_t is a huge disappointment.
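For illustration, a call site using the U8 macro above might look like this (note the parentheses: U8 is a function-like macro, so u8"" cannot simply be replaced textually with U8""):

std::string legacy = U8("Привет, мир!");  // const char* in C++20, u8 literal before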

If anyone's interested, I've posted an open source example of my own utf8 everywhere response on github (for the visual studio community). https://github.com/JackHeeley/App3Dev

Upvotes: 5

NiceL

Reputation: 21

It may not be convenient, but you can use this: (const char*)u8"こんにちは"

Or provide two overloaded functions, one taking const char* and one taking const char8_t*, as in the sketch below.
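A minimal sketch of that overload approach (the function name print_utf8 is made up for illustration; reading char8_t data through a const char* is well-defined):

#include <iostream>

// Hypothetical sink that wants plain char data.
void print_utf8(const char* s) {
    std::cout << s;
}

#if defined(__cpp_char8_t)
// C++20: u8"..." yields const char8_t*, so forward it to the char overload.
void print_utf8(const char8_t* s) {
    print_utf8(reinterpret_cast<const char*>(s));
}
#endif

int main() {
    print_utf8("こんにちは");    // calls the const char* overload
    print_utf8(u8"こんにちは");  // const char8_t* in C++20, const char* before
}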

Upvotes: 2

vitaut

Reputation: 55594

Should I be creating a new utf8string?

No, C++20 adds std::u8string. However, I would recommend using std::string instead, because char8_t is poorly supported in the standard library and not supported by any system APIs at all (and likely never will be, for compatibility reasons). On most platforms normal char strings are already UTF-8, and on Windows with MSVC you can compile with /utf-8, which gives you portable Unicode support across major operating systems.

For example, you cannot even write a Hello World program using u8 strings in C++20 (https://godbolt.org/z/E6rvj5):

std::cout << u8"Hello, world!\n"; // won't compile in C++20

On Windows with MSVC and pre-C++20 the situation is even worse because u8 strings may be silently corrupted. For example:

std::cout << "Привет, мир!\n";

will produce valid UTF-8 that may or may not be displayed correctly in the console, depending on its current code page, while

std::cout << u8"Привет, мир!\n";

will almost definitely give you an invalid result such as ╨а╤Я╨б╨В╨а╤С╨а╨Ж╨а┬╡╨бтАЪ, ╨а╤Ш╨а╤С╨б╨В!.

Upvotes: 28

lubgr

Reputation: 38287

Should I be creating a new utf8string?

No, it's already there. P0482 not only proposes char8_t, but also a new specialization of std::basic_string for the char8_t character type, named std::u8string. So this already compiles with clang and libc++ from trunk:

const std::u8string str = u8"●";

The fact that std::string construction from a u8-literal breaks is unfortunate. From the proposal:

This proposal does not specify any backward compatibility features other than to retain interfaces that it deprecates. The author believes such features are necessary, but that a single set of such features would unnecessarily compromise the goals of this proposal. Rather, the expectation is that implementations will provide options to enable more fine grained compatibility features.

But I guess most such initializations should be grep-able or amenable to automatic clang tooling fixes.

Upvotes: 23

Fabio Fracassi

Reputation: 3890

In addition to @lubgr's answer, the paper "char8_t backward compatibility remediation" (P1423) discusses several ways to construct a std::string from char8_t character arrays.

Basically, the idea is that you can cast the u8 char array to a "normal" char array to get the same behaviour as in C++17 and before; you just have to be a bit more explicit. The paper discusses various ways to do this.

The simplest method that fits your use case (though not fully zero-overhead, unless you add more overloads) is probably the last one, i.e. introducing explicit conversion functions:

std::string from_u8string(const std::string &s) {
  return s;
}
std::string from_u8string(std::string &&s) {
  return std::move(s);
}
#if defined(__cpp_lib_char8_t)
std::string from_u8string(const std::u8string &s) {
  return std::string(s.begin(), s.end());
}
#endif
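Usage then looks like this; in C++20 the u8 literal converts implicitly to a temporary std::u8string, which is the "not fully zero-overhead" part mentioned above:

std::string myString = from_u8string(u8"●");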

Upvotes: 32
