Reputation: 113956
How do I pass a UTF-16 char from a C++/CLI function to a .NET function? What types do I use on the C++/CLI side and how do I convert it?
I've currently defined the C++/CLI function as follows:
wchar_t GetCurrentTrackID(); // 'wchar_t' is the C++ unicode char equivalent to .NET's 'char'?
The .NET wrapper is defined as:
System::Char GetCurrentTrackID(); // here, 'char' means UTF-16 char
I'm currently using the code below to convert it, but when testing it I only get a null character. How do I properly convert a Unicode character code to its char
equivalent for .NET?
#pragma managed
return (System::Char)player->GetCurrentTrackID();
Upvotes: 1
Views: 353
Reputation: 941307
They are directly compatible. You can assign a Char to a wchar_t and the other way around without a cast; the compiler will not emit any kind of conversion call. This is true for many simple value types in C++/CLI: Boolean vs bool, SByte vs char, Byte vs unsigned char, Int16 vs short, Int32 vs int or long, Int64 vs long long, Single vs float, Double vs double, plus their unsigned varieties. The compiler treats them as aliases since they have the exact same binary representation.
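For illustration, a small sketch of a few of these equivalences, assuming compilation with /clr (the function and variable names are arbitrary); every assignment compiles without a cast:

void Examples()
{
    System::Char    c = L'x';    wchar_t w = c;   // Char    <-> wchar_t
    System::Int32   i = 42;      int     n = i;   // Int32   <-> int
    System::Double  d = 3.14;    double  x = d;   // Double  <-> double
    System::Boolean b = true;    bool    f = b;   // Boolean <-> bool
}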
But not strings or arrays: they are classes with a non-trivial implementation that doesn't match their native counterparts at all.
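Applied to the wrapper in the question, a minimal sketch could look like the following. The native Player class here is hypothetical, standing in for whatever the question's player pointer refers to; the point is that the wchar_t result can be returned directly where a System::Char is expected:

#pragma unmanaged
class Player
{
public:
    wchar_t GetCurrentTrackID() { return L'A'; }  // stand-in implementation
};

#pragma managed
public ref class PlayerWrapper
{
public:
    PlayerWrapper() : player(new Player()) {}
    ~PlayerWrapper() { delete player; }

    // System::Char and wchar_t share the same 16-bit representation,
    // so the native value converts implicitly; no cast or conversion
    // call is emitted by the compiler.
    System::Char GetCurrentTrackID()
    {
        return player->GetCurrentTrackID();
    }

private:
    Player* player;  // hypothetical native player instance
};

Since the conversion is a bit-for-bit copy, the cast in the question is harmless but unnecessary; a null character in the result suggests the native call itself is already returning L'\0'.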
Upvotes: 2