Rezaeimh7

Reputation: 1545

C++ How to convert unicode character to int

I want to convert Unicode characters (Persian) to int. Based on this list, the Unicode number of 'آ' is U+0622.

Suppose I want to get U+0622 as an integer value. I wrote this piece of code:

unsigned int Alef = (unsigned int)'آ';
std::cout << Alef << std::endl;

output:

63

The correct answer is 1570, but as you can see the output is wrong. I guess it only converts the first byte of the Unicode character.

How do I convert that Unicode character so that it gives the correct answer?

Upvotes: 1

Views: 3716

Answers (1)

pointerless

Reputation: 753

Try expressing the character as a wide character (wchar_t) literal:

unsigned int Alef = (unsigned int) L'آ';
std::cout << Alef << std::endl;

But make sure you're saving the file as Unicode: nano, for example, converts the 'آ' to a '?' before saving. As would Notepad on Windows, I think?

Also, to add to my answer: you should write Unicode characters to std::wcout, not std::cout, since cout is for single-byte chars and wcout is for wchar_t types.
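Putting both points together, here's a minimal sketch; it assumes the source file is saved as UTF-8 (or another Unicode encoding your compiler accepts) and that your terminal's locale can display Persian:

#include <iostream>
#include <locale>

int main() {
    // Wide-character literal: with the file saved as Unicode, this holds
    // the code point U+0622, i.e. 1570.
    unsigned int alef = (unsigned int)L'آ';

    std::wcout.imbue(std::locale(""));  // pick up the environment's locale

    // Both the numeric value and the character itself go through
    // std::wcout, since we're dealing with wide characters here.
    std::wcout << alef << std::endl;   // prints 1570
    std::wcout << L'آ' << std::endl;   // prints آ, terminal permitting
    return 0;
}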

EDIT: Notepad does save as Unicode

Upvotes: 5
