Reputation: 420
I was just reading the classic K&R and encountered the following syntax:
printf("%.*s",max,s);
What is the meaning of "." here? When I don't apply a ".", the whole string is printed, but when I do apply a ".", at most max characters are printed. I would be really thankful if anyone could explain this.
Upvotes: 12
Views: 4514
Reputation: 1
It limits how many characters are printed, based on the precision argument passed to printf; in this case it prints at most max characters of the string.
Upvotes: 0
Reputation: 22964
The dot has a different meaning with different conversion specifiers. If you use a.b with %f, then a gives the minimum field width and b gives the number of digits printed after the decimal point. If you use a.b with %s, a gives the minimum field width, whereas b gives the maximum length of the string that will be printed.
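A minimal sketch of the difference (the values are just illustrative):

    #include <stdio.h>

    int main(void)
    {
        /* %8.3f: minimum field width 8, 3 digits after the decimal point */
        printf("[%8.3f]\n", 3.14159);   /* prints [   3.142] */

        /* %8.3s: minimum field width 8, at most 3 characters of the string */
        printf("[%8.3s]\n", "abcdef");  /* prints [     abc] */

        return 0;
    }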
Upvotes: 1
Reputation: 155046
A printf format string allows specifying width and precision.
A width, such as %25s, tells printf to pad the string to a width of 25 characters, inserting spaces before the string. (If the string is wider than 25 characters, it is still printed in its entirety.)
A "precision" applied to a string format, such as %.25s, limits the length of the printed string to 25 characters. A string of 3 characters will be printed fully (with no padding), and a string of 30 characters will be missing its last five characters.
%.*s avoids hardcoding the precision in the format, specifying it instead as an integer argument to printf, in your case max.
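For example (the string and the value of max here are made up):

    #include <stdio.h>

    int main(void)
    {
        const char *s = "abc";
        int max = 2;

        printf("[%25s]\n", s);      /* width: pads "abc" with 22 leading spaces */
        printf("[%.25s]\n", s);     /* precision: prints "abc" in full, no padding */
        printf("[%.*s]\n", max, s); /* precision taken from the argument: prints "ab" */

        return 0;
    }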
Upvotes: 4
Reputation: 222900
In %.*s, the .* limits the number of bytes that will be written. If this were written with a numeral included, such as %.34s, then the numeral would be the limit. When an asterisk is used, the limit is taken from the corresponding argument to printf.
From C 2011 (N1570) 7.21.6.1 4, describing conversion specifications for fprintf et al:
An optional precision that gives … the maximum number of bytes to be written for s conversions. The precision takes the form of a period (.) followed either by an asterisk * (described later) or by an optional decimal integer; if only the period is specified, the precision is taken as zero.
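A short sketch of the variants described above (the string and the limit are made up):

    #include <stdio.h>

    int main(void)
    {
        const char *s = "hello, world";
        int limit = 5;

        printf("[%.34s]\n", s);       /* hardcoded precision: string is shorter than 34, printed whole */
        printf("[%.*s]\n", limit, s); /* precision taken from the int argument: prints "hello" */
        printf("[%.s]\n", s);         /* only the period: precision is zero, so nothing is printed */

        return 0;
    }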
Upvotes: 12
Reputation: 8020
First of all, K&R describes the original version of C, which differs from the current specification. If you want specific information about K&R C, then consult documentation specific to it.
From the current C standard:
An optional precision that gives (...) the maximum number of bytes to be written for s conversions. The precision takes the form of a period (.) followed either by an asterisk * (described later) or by an optional decimal integer.
http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf
Similar documentation is available everywhere online for multiple standards or implementations.
Upvotes: 2
Reputation: 4320
It specifies the "Character String Maximum field width"
The precision within a string format specifies the maximum field width:
%2.6s specifies a minimum width of 2 and a maximum width of 6 characters. If the string is longer than 6 characters, it will be truncated.
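For instance (sample strings assumed):

    #include <stdio.h>

    int main(void)
    {
        printf("[%2.6s]\n", "a");        /* shorter than width 2: padded to [ a] */
        printf("[%2.6s]\n", "abcd");     /* between 2 and 6 characters: printed as [abcd] */
        printf("[%2.6s]\n", "abcdefgh"); /* longer than 6: truncated to [abcdef] */

        return 0;
    }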
Upvotes: 13