Tristan Alexander

Reputation: 25

Ascii character codes in C++ visual studio don't match up with actual character symbols?

I am trying to represent 8-bit numbers with characters, and I don't want to have to use an int to define each character; I want to use the actual character symbol. But when I use standard ASCII codes, everything outside the range 32 to roughly 127 comes out completely different.

So I looped through all 256 ASCII codes and printed them; for instance, it said that the character '±' is code 241. But when I copy and paste that symbol from the console, or even use the Alt code, it says that character is code -79. How can I get these two things to match up? Thank you!

Upvotes: 0

Views: 696

Answers (1)

xanatos

Reputation: 111820

This is likely a codepage problem. Windows console applications use a codepage that maps the values 0-255 to a subset of Unicode characters. It is a mechanism that survives from the pre-Unicode era to support non-American character sets. There are Windows APIs to select the codepage for the console: SetConsoleOutputCP (and SetConsoleCP for input):

#include <iostream>

#define WIN32_LEAN_AND_MEAN
#include <Windows.h>

int main() {
    // Query the codepage currently used for console output.
    unsigned int cp = ::GetConsoleOutputCP();

    std::cout << "Default cp of console: " << cp << std::endl;

    // Switch both console input and output to iso-8859-1 (codepage 28591),
    // whose 0-255 characters map directly to Unicode codepoints 0-255.
    ::SetConsoleCP(28591);
    ::SetConsoleOutputCP(28591);

    cp = ::GetConsoleOutputCP();

    std::cout << "Now the cp of console is: " << cp << std::endl;

    // Print characters 32-255, 32 per line.
    for (int i = 32; i < 256; i++) {
        std::cout << (char)i;

        if (i % 32 == 31) {
            std::cout << std::endl;
        }
    }
}

The 28591 codepage is the iso-8859-1 codepage, which maps the 0-255 characters to the Unicode 0-255 codepoints (the 28591 number is taken from https://learn.microsoft.com/en-us/windows/win32/intl/code-page-identifiers).

Upvotes: 1
