Reputation: 95
I'm still very new to C++ so this might be a stupid question. In the code below, why is it that when I change the type of index to a signed char, index > 25
evaluates to true? Isn't a signed char just a 1-byte integer?
#include <iostream>
using namespace std;

char lowercase[26] = {'a','b','c','d','e','f','g','h','i','j','k','l','m','n','o','p','q','r','s','t','u','v','w','x','y','z'};

int main() {
    short index;
    cout << "Enter a number 0 to 25: ";
    cin >> index;
    if (index > 25 || index < 0) {
        cout << "That number is out of range." << endl;
        return 0;
    }
    cout << "The lowercase letter for this number is " << lowercase[index] << "." << endl;
    return 0;
}
Upvotes: 2
Views: 45
Reputation: 467
This is because when the user enters something, what gets stored in a char is the character's code from the ASCII table, e.g. '0' == 48. A char will only take the first character entered, a single digit/letter, e.g. char foo = '1';
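A quick way to see this (just an illustrative snippet, not the question's program) is to print the numeric value of a char:

#include <iostream>
using namespace std;

int main() {
    char foo = '1';            // stores the character '1', not the number 1
    cout << (int)foo << endl;  // prints 49 on an ASCII system
    return 0;
}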
Upvotes: 0
Reputation: 32732
Let's rephrase the problem just a bit:
char index;
cin >> index;
It may be easier to see the problem this way. When you read input into a char
, you get the character code of the first character the user typed. On a typical system that encoding is ASCII, where the digits '0' through '9' have codes 48 through 57.
So when index
is a signed char and the user types a digit, you get a value that is >= 48, which is why index > 25 is true.
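Here is a small sketch that shows the values involved (the cast and the extra output lines are just for illustration):

#include <iostream>
using namespace std;

int main() {
    char index;
    cout << "Enter a number 0 to 25: ";
    cin >> index;                                     // reads only the first character typed
    cout << "Character read: " << index << endl;      // e.g. '7'
    cout << "Numeric value: " << (int)index << endl;  // e.g. 55 in ASCII, which is > 25
    return 0;
}

If you really do want to read a single digit as a char, subtracting '0' converts it to the number it represents (index - '0'); otherwise keeping index as a short or int, as in your original code, is the simpler fix.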
Upvotes: 3