Flaz

Reputation: 135

What's the difference between %i, %d and %D in C? (printf)

I'm wondering what the difference is between %D, %d and %i in the printf function. They all display integers, so why are there three formats for the same job? There must be some difference between them.

Edit: I was also asking about %D, not only about %i and %d.

Upvotes: 2

Views: 3140

Answers (2)

woz

Reputation: 574

First, %D isn't a standard conversion specifier.

When it comes to %d and %i, there's no difference for output (e.g. printf), as pointed out by some users in the comments and by Oliver Charlesworth in his answer.

However, for input (e.g. scanf), %i will also scan hexadecimal values (if prefixed with 0x) and octal values (if prefixed with 0); otherwise it reads decimal, just like %d.

E.g.: if you enter 0x28 when scanning with %i, the stored value is 40 in decimal.

EDIT: Some code as an example:

#include <stdio.h>

int main(void){
    int dec, hex;
    scanf("%i", &hex); // for example, enter 0x28 -> hex == 40
    scanf("%d", &dec); // for example, enter 28   -> dec == 28
    printf("Hex: %d\nDec: %d\n", hex, dec); // prints "Hex: 40" and "Dec: 28"
    return 0;
}
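
The octal case works the same way; a minimal sketch (same idea, different prefix):

#include <stdio.h>

int main(void){
    int oct, dec;
    scanf("%i", &oct); // e.g. enter 050 -> read as octal, oct == 40
    scanf("%d", &dec); // e.g. enter 050 -> %d treats the leading 0 as a digit, dec == 50
    printf("Oct: %d\nDec: %d\n", oct, dec);
    return 0;
}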

Upvotes: 9

Oliver Charlesworth

Reputation: 272487

There is no difference between %d and %i, and as far as I can tell, %D isn't a thing (at best it's a compiler-specific extension).

See http://en.cppreference.com/w/c/io/fprintf, or section 7.19.6.1 of the C99 standard, for example.
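
For instance, a minimal sketch showing that %d and %i produce identical output with printf:

#include <stdio.h>

int main(void){
    int n = 42;
    printf("%d\n", n); // prints 42
    printf("%i\n", n); // prints 42, identical to %d
    return 0;
}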

Upvotes: 4
