Reputation: 1578
What happens if I read an integer like 20, 30, 10000...9999 into variable a? It only prints the first digit of the number I've read. Why is that? For example, if I read 123, it prints 1 on the screen. Isn't it supposed to convert the integer 123 into its equivalent ASCII character representation?
#include <stdio.h>
int main() {
    char a;
    scanf("%c", &a);
    printf("%c", a);
    return 0;
}
This is an exam question about the C language.
Upvotes: 0
Views: 13666
Reputation: 5359
No, it will read only the first character into the char variable. How could a char variable store more than one character at a time? It can't.
So if you want the ASCII value, read the input as an integer instead.
int a;
scanf("%d", &a); // suppose input is 65
printf("%c", a); // prints 'A'
printf("%d", a); // prints 65
Whereas
char a;
scanf("%c", &a); // suppose input is 65
printf("%c", a); // prints '6'
printf("%d", a); // prints 54 which is the ASCII value of '6'
Upvotes: 1
Reputation: 399833
No, it reads the character, which is represented by the machine as a small integer, into the variable.
If you enter 100 (the number 100, three keypresses and thus three characters), it will only store the first character of that, i.e. the leading 1.
If you wanted to convert a number to an actual integer, you should use %d and an int variable, of course.
Printing with %c will print back a single character, by interpreting the small integer value as a character (rather than as an integer). So for an input of 100 you will see 1 printed back out, i.e. the character that represents the decimal digit one.
If you want to print out the numeric representation of the character you read in, scan with %c but print with %d, and cast the char to (int) in the printf() call.
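For example, here is a minimal sketch of that approach (the value 49 assumes an ASCII system, where the character '1' has code 49):
#include <stdio.h>
int main(void) {
    char a;
    scanf("%c", &a);         // for the input 123, reads only the leading character '1'
    printf("%c\n", a);       // prints the character itself: 1
    printf("%d\n", (int)a);  // prints its numeric value: 49 on an ASCII system
    return 0;
}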
Upvotes: 3
Reputation: 1147
The problem is that %c parses a single char from the console input. From a number like 123 it takes only the first character and discards the rest. The way to parse an int value is to use %d in the scanf call.
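A minimal sketch of that approach; checking the return value of scanf is an extra precaution not mentioned above, but it confirms a number was actually read:
#include <stdio.h>
int main(void) {
    int a;
    if (scanf("%d", &a) == 1) {  // for the input 123, reads the whole number
        printf("%d\n", a);       // prints 123
    }
    return 0;
}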
Upvotes: 1