Reputation: 44064
Is there any practical difference between WCHAR and wchar_t?
Upvotes: 25
Views: 22660
Reputation: 21309
Official Microsoft docs are here: https://learn.microsoft.com/en-us/windows/win32/winprog/windows-data-types
WCHAR A 16-bit Unicode character. For more information, see Character Sets Used By Fonts.
This type is declared in WinNT.h as follows:
typedef wchar_t WCHAR;
Upvotes: 0
Reputation: 21
typedef wchar_t WCHAR; // wc, 16-bit UNICODE character
This is defined in winnt.h, so WCHAR is originally wchar_t; WCHAR is just an abbreviation for it. You can use the wcscat, wcslen, etc. functions with both.
Upvotes: 0
Reputation: 33998
Does anybody know how old WCHAR is? I would imagine it dates to at least Windows NT 3.1. I'd speculate that when Microsoft started using WCHAR in the Windows headers, wchar_t was not defined in either the C or C++ standard. Please correct me if I'm wrong.
Microsoft is in the unenviable position of having to support declarations and headers that a) must work in either C or C++; b) compile under very different architectures (x86, MIPS, PowerPC, Alpha, ...); and c) must be backwards-compatible with source code written for 15+ year old compilers. Plus, with any breaking change, thousands of books, reference manuals, online documents, etc. published over the last two decades would suddenly become WRONG.
WCHAR is an interface: once it was published, it was written in stone, even if it's not needed for new code.
Upvotes: 5
Reputation: 247969
wchar_t is a distinct type, defined by the C++ standard.
WCHAR is nonstandard and, as far as I know, exists only on Windows. However, it is simply a typedef (or possibly a macro) for wchar_t, so it makes no practical difference.
Older versions of MSVC did not have wchar_t as a first-class type; instead it was simply a typedef for unsigned short.
Most likely, Microsoft introduced WCHAR to represent a "wide character type" across any compiler version, whether or not wchar_t existed as a native type.
You should use wchar_t in your code, though. That's what it's for.
Upvotes: 18
Reputation: 754725
Practically speaking, there isn't much difference: both represent the same underlying type, a 2-byte-wide value.
LibT will be on shortly to give you the real insane differences between the various platforms and the C++ standard :)
Upvotes: 0
Reputation: 545588
Well, one practical difference would be that WCHAR doesn't exist on my platform. For Windows only (with no intention of ever porting the program to another platform), and with the necessary headers included, it's the same (since WCHAR is just a typedef).
Upvotes: 13