Adrian Ratnapala

Reputation: 5713

Why both UNICODE and _UNICODE?

I've been looking at the command line generated by Visual Studio, and for one of my projects it defines two symbols: _UNICODE and UNICODE. Now if I understand this rather old document, the _UNICODE symbol is a VC++ thing that causes certain standard functions to use wchar_t instead of char in their interfaces.

But what does the UNICODE without an underscore mean?

Upvotes: 58

Views: 37424

Answers (3)

Roman Ryltsov

Reputation: 69706

Raymond Chen explains it here: TEXT vs. _TEXT vs. _T, and UNICODE vs. _UNICODE:

The plain versions without the underscore affect the character set the Windows header files treat as default. So if you define UNICODE, then GetWindowText will map to GetWindowTextW instead of GetWindowTextA, for example. Similarly, the TEXT macro will map to L"..." instead of "...".

The versions with the underscore affect the character set the C runtime header files treat as default. So if you define _UNICODE, then _tcslen will map to wcslen instead of strlen, for example. Similarly, the _TEXT macro will map to L"..." instead of "...".

Looking into the Windows SDK, you will find things like this:

#ifdef _UNICODE
#ifndef UNICODE
#define UNICODE
#endif
#endif

Upvotes: 75

AminM

Reputation: 1824

In a nutshell,

UNICODE is used by Windows headers,

whereas

_UNICODE is used by C-runtime/MFC headers.

Upvotes: 37

Hans Passant

Reputation: 942000

Compiler vendors have to prefix the identifiers in their header files with an underscore to prevent them from colliding with your identifiers. So <tchar.h>, a compiler header file, uses _UNICODE. The Windows SDK header files are compiler agnostic, and stone-cold old; they use UNICODE without the underscore. You'll have to define both.

Upvotes: 24
