Reputation: 737
I just started programming in the C programming language and want to get the subject of data types, their sizes, and their value ranges straight.
I've seen a few value-range tables of different data types (such as this one).
The thing is, I've learned and read here that there are different parameters which influence the size of each data type, and my assumption is that the value-range should vary as well.
For example, if 1 byte = 16 bits, then I'd expect an unsigned char to hold 0-65535.
How accurate are those tables? Is the range they show guaranteed (even though the types might actually hold smaller/larger ranges)?
Upvotes: 1
Views: 634
Reputation: 121387
The C language specification doesn't define any exact range for each data type. It only defines the minimum range that each type must be able to hold.
Coming to your question about that table: it's NOT an accurate representation of the ranges defined by C. It may be true for the particular platform the author was running on, but it can't always be (and shouldn't be) taken as an authoritative source.
If you want to know the exact range on your platform, look at (or include) <limits.h>. Or you can use the sizeof operator on the types to get the information from the compiler.
If you want to know the exact number of bits, use CHAR_BIT, defined in <limits.h>.
For example, the number of bits in an int can be found using CHAR_BIT * sizeof(int). In the same way, for a given type T, the number of bits is CHAR_BIT * sizeof(T).
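Putting that together, here is a minimal sketch (plain standard C; everything it prints comes from <limits.h> and sizeof, so the output reflects whatever your platform actually provides):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* CHAR_BIT is the number of bits in a byte on this platform */
        printf("bits in a byte: %d\n", CHAR_BIT);
        printf("bits in an int: %zu\n", CHAR_BIT * sizeof(int));

        /* Exact ranges for this platform, straight from <limits.h> */
        printf("char: %d to %d\n", CHAR_MIN, CHAR_MAX);
        printf("int:  %d to %d\n", INT_MIN, INT_MAX);
        printf("long: %ld to %ld\n", LONG_MIN, LONG_MAX);
        return 0;
    }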
Also read the first 3 or 4 questions from the C-FAQ, which are quite relevant to your question.
Upvotes: 4
Reputation: 126
Your thought process is more or less correct. These tables are generally reliable because the ranges are simple to calculate once you know each type's size.
A char is always exactly one byte, and on virtually every platform you will encounter, a byte is 8 bits (not 16). One 8-bit byte has 2^8 = 256 possible combinations, so the range of a char will be 0 to 255 or -128 to 127, depending on whether it's unsigned or signed.
For the other integer types, the same logic applies. The only difference is that their sizes depend on the platform and compiler you build for (which the table acknowledges, giving different ranges for an int of 2 bytes and an int of 4 bytes).
In practice, there are no other parameters that affect the values these types can hold besides their size in bytes, and if you are doing something that depends on their size (like detecting integer overflow), you should check with the sizeof operator or the constants in <limits.h>, as in the sketch below.
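For example, here is a hedged sketch of such an overflow check (add_would_overflow is just an illustrative name, not a standard function); it tests against the platform's real limits from <limits.h> rather than assuming any particular size:

    #include <limits.h>
    #include <stdbool.h>

    /* Illustrative helper: returns true if a + b would overflow an int.
       Checks against this platform's actual limits, not an assumed width. */
    bool add_would_overflow(int a, int b)
    {
        if (b > 0 && a > INT_MAX - b) return true;  /* would exceed INT_MAX */
        if (b < 0 && a < INT_MIN - b) return true;  /* would fall below INT_MIN */
        return false;
    }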
Upvotes: 0
Reputation: 5421
C is a "close-to-metal" language, therefore some things (like size of int) depend on the particular architecture you're compiling for. It is always know before your program leaves your hands, so you can easily take care of it with sizeof and #defines
Tables found anywhere are only for reference. You can depend only on what's visible to the compiler.
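For example, a sketch of one such compile-time guard built from <limits.h> (the 32-bit requirement here is only an illustrative assumption):

    #include <limits.h>

    /* Refuse to compile on platforms where int is narrower than 32 bits */
    #if INT_MAX < 2147483647
    #error "This program requires an int of at least 32 bits."
    #endif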
Upvotes: 0
Reputation: 742
The minimum range shown there must be available: it is the minimum guaranteed by the standard, and all conforming implementations will supply at least that.
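As a sketch (assuming a C11 compiler for _Static_assert), these assertions restate a few of those minimum guarantees from <limits.h>, so they can never fail on a conforming implementation:

    #include <limits.h>

    /* Minimum magnitudes required by the C standard */
    _Static_assert(CHAR_BIT >= 8, "a byte has at least 8 bits");
    _Static_assert(SCHAR_MAX >= 127, "signed char covers at least -127..127");
    _Static_assert(INT_MAX >= 32767, "int covers at least -32767..32767");
    _Static_assert(LONG_MAX >= 2147483647L, "long covers at least -(2^31 - 1)..2^31 - 1");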
Upvotes: 0