Asha

Reputation: 11232

Character code different between C++ and C#

I have a string handling function in both C++ and C#. In C++ the code for the character ˆ is returned as -120, whereas in C# it is 710. When building the C++ project in Visual Studio 2010 I have the character set set to "Not Set" in the project settings. In C# I am using System.Text.Encoding.Default during one of the conversions. Does that make any difference? How can I get the same behavior in C++ as in C#?

Upvotes: 2

Views: 1533

Answers (1)

bames53

Reputation: 88155

The character is U+02C6 (MODIFIER LETTER CIRCUMFLEX ACCENT). The encoding you're using in C++ is probably Windows-1252 (CP1252), which encodes this character as the single byte 0x88 (which is -120 when a signed char is printed in decimal). C# strings use UTF-16, which encodes this character as the code unit 0x02C6 (710 in decimal).
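
For reference, a minimal C++ sketch of where the -120 comes from (assuming char is signed, as it is by default with MSVC):

    #include <iostream>

    int main() {
        char c = '\x88';                           // 'ˆ' encoded in CP1252 is the single byte 0x88
        std::cout << static_cast<int>(c) << '\n';  // prints -120, since char is signed here
        return 0;
    }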

You can use UTF-16 in C++ on Windows by using wchar_t instead of char.
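
A minimal sketch of that (assuming MSVC, where wchar_t is a 16-bit UTF-16 code unit):

    #include <iostream>

    int main() {
        wchar_t wc = L'\u02C6';                     // MODIFIER LETTER CIRCUMFLEX ACCENT
        std::cout << static_cast<int>(wc) << '\n';  // prints 710 (0x02C6), matching C#
        return 0;
    }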

You can't make C# strings use CP1252 internally, but you can get byte arrays in different encodings from a string by using the Encoding class:

byte[] in_cp1252 = Encoding.GetEncoding(1252).GetBytes("Your string here");

Upvotes: 7
