Reputation: 107
I'm trying to print three columns to the console:
Value: Datatype: Size(in bytes):
#include <stdio.h>
/*
print out the primitive (basic built in) data types for C
along with their size in bytes.
*/
int main(){
char letter = 'j';
short int x = 55437;
long int y = 1234567;
double dbl = 5.99;
float boobs = 4.66;
printf("Value: Datatype: Size(in bytes):\n\n");
printf("%c\t char\t %c\n", letter, sizeof(letter));
printf("%d\t short\t %d\n", x, sizeof(x));
printf("%ld\t long\t %ld\n", y, sizeof(y));
printf("%lf\t double\t %lf\n", dbl, sizeof(dbl));
printf("%f\t float\t %f\n", boobs, sizeof(boobs));
}
I've almost got it, but some of the output is still wrong:
Why does my short int print as -10099 when I assigned it 55437? And why are the double and float rows shifted so far to the right in the Datatype column?
Upvotes: 0
Views: 158
Reputation:
try this :)
printf("%c\t char\t %g\n", letter, sizeof(letter));
printf("%d\t short\t %g\n", x, sizeof(x));
printf("%ld\t long\t %g\n", y, sizeof(y));
printf("%g.2\t double\t %g\n", dbl, sizeof(dbl));
printf("%f.2\t float\t %g\n", boobs, sizeof(boobs));
Upvotes: 0
Reputation: 40145
try this
printf("%c\tchar\t %d\n", letter, (int)sizeof(letter));
printf("%hd\tshort\t %d\n", x, (int)sizeof(x));//55437 > max of int16
printf("%ld\tlong\t %d\n", y, (int)sizeof(y));
printf("%f\tdouble\t %d\n", mile, (int)sizeof(mile));
printf("%f\tfloat\t %d\n", wage, (int)sizeof(wage));
Upvotes: 1
Reputation: 122423
When a value is printed with "%c" in printf, it is converted to unsigned char. However, in ASCII only the values from 0x20 to 0x7E are printable characters; how the other characters are shown is up to the terminal.
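A minimal sketch of the effect (casting the size to int here, since int is the argument type "%c" actually expects, rather than the size_t that sizeof yields):
#include <stdio.h>

int main(void){
    char letter = 'j';
    /* sizeof(letter) is 1; as a character code, 1 is a control character
       (below 0x20), so most terminals show nothing visible between the brackets */
    printf("[%c]\n", (int)sizeof(letter));
    /* to see the number itself, use an integer conversion such as %zu */
    printf("[%zu]\n", sizeof(letter));
    return 0;
}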
Upvotes: 1