printf Precision

Precision specifies the minimum number of digits to print for the integer conversions d, i, o, u, x, and X; values with fewer digits are padded with leading zeros.

#include <stdio.h>

int main(void)
{
    printf("%10.4d\n", 35);
    return 0;
}

The value is padded with leading zeros to 4 digits and right-justified in a field of 10 characters:

      0035
  1. 10 is the field width: the converted value is right-justified in a field of at least 10 characters.
  2. 4 is the precision: 35 is padded with leading zeros to 4 digits.
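
As a supplementary sketch (not part of the original example), the snippet below contrasts field width alone, precision alone, and the two combined for the same integer argument; the brackets make the padding visible.

#include <stdio.h>

int main(void)
{
    printf("[%10d]\n", 35);   /* width only: spaces pad on the left to 10 characters */
    printf("[%.4d]\n", 35);   /* precision only: leading zeros pad to 4 digits -> [0035] */
    printf("[%10.4d]\n", 35); /* both: 0035, right-justified in a field of 10 */
    return 0;
}

Width controls how much space the whole field occupies (padded with spaces by default), while precision controls how many digits the number itself must have (padded with zeros).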