Reputation: 9012
To print a number of type off_t, it was recommended to use the following piece of code:

off_t a;
printf("%llu\n", (unsigned long long)a);
Upvotes: 14
Views: 5613
Reputation: 183361
The format string doesn't tell the compiler to perform a cast to unsigned long long, it just tells printf that it's going to receive an unsigned long long. If you pass in something that's not an unsigned long long (which off_t might not be), then printf will simply misinterpret it, with surprising results.
The reason for this is that the compiler doesn't have to know anything about format strings. A good compiler will give you a warning message if you write printf("%d", 3.0), but what can a compiler do if you write printf(s, 3.0), with s being a string determined dynamically at run-time?
Edited to add: As Keith Thompson points out in the comments below, there are many places where the compiler can perform this sort of implicit conversion. printf is rather exceptional, in being one case where it can't. But if you declare a function to accept an unsigned long long, then the compiler will perform the conversion:

#include <stdio.h>
#include <sys/types.h>

int print_llu(unsigned long long ull)
{
    return printf("%llu\n", ull); // O.K.; already converted
}

int main()
{
    off_t a = 0;
    printf("%llu\n", a); // WRONG! Undefined behavior!
    printf("%llu\n", (unsigned long long) a); // O.K.; explicit conversion
    print_llu((unsigned long long) a); // O.K.; explicit conversion
    print_llu(a); // O.K.; implicit conversion
    return 0;
}
The reason for this is that printf is declared as int printf(const char *format, ...), where the ... is a "variadic" or "variable-arguments" notation, telling the compiler that it can accept any number and types of arguments after the format. (Obviously printf can't really accept any number and types of arguments: it can only accept the number and types that you tell it to, using format. But the compiler doesn't know anything about that; it's left to the programmer to handle it.)
Even with ..., the compiler does do some implicit conversions, such as promoting char to int and float to double. But these conversions are not specific to printf, and they do not, and cannot, depend on the format string.
Upvotes: 16
Reputation: 49813
The signature of printf looks like this:
int printf(const char *format, ...);
The vararg ... indicates that anything can follow, and by the rules of C, you can pass anything to printf as long as you include a format string. C simply does not have any constructs to describe restrictions on the types of objects passed. This is why you must use casts, so that the objects passed have exactly the needed type.
This is typical of C: it walks a line between rigidity and trusting the programmer. An unrelated example is that you may use char * (without const) to refer to string literals, but if you modify them, your program may crash.
Upvotes: 3
Reputation: 229108
The problem is that you don't know how big an off_t is. It could be a 64-bit type or a 32-bit type (or perhaps something else). If you use %llu and do not pass an (unsigned) long long type, you get undefined behavior; in practice it might just print garbage.
Not knowing how big it is, the easy way out is to cast it to the biggest reasonable type your system supports, e.g. an unsigned long long. That way using %llu is safe, as printf will receive an unsigned long long type because of the cast.
(E.g. on Linux, the size of an off_t is 32 bits by default on a 32-bit machine, and 64 bits if you enable large file support via #define _FILE_OFFSET_BITS 64 before including the relevant system headers.)
Upvotes: 5