#include <stdio.h>

char buf[10];
int counter, x = 0;
snprintf(buf, sizeof buf, "%.100d%n", x, &counter);
printf("Counter: %d\n", counter);

I am learning about precision with printf. With %.100d%n, the precision means x is rendered with at least 100 digits, zero-padded on the left.

What I don't understand is why the counter ends up at 100 when only 10 characters (including the terminator) are actually written to the buffer.

  • The 100 you're getting is the size actually needed to print the whole thing without truncation. Anyway, I'd not use %n at all (it's not implemented on all platforms), but rather use the return value of snprintf. Commented Mar 5, 2019 at 14:53
  • Take a look at what is actually written to your buf Commented Mar 5, 2019 at 14:54
  • @Jabberwocky %n isn't implemented on windows unless you use gcc -D__USE_MINGW_ANSI_STDIO=1 Commented Mar 5, 2019 at 14:54
  • 1
    it's the design of %n specifier, which can also be used as a security hole Commented Mar 5, 2019 at 14:54
  • 1
    @Jean-FrançoisFabre %n being a security hole isn't really a strong argument. See also stackoverflow.com/questions/54957216/… It seems more like just another Microsoftian non-portable incompatibility. Commented Mar 5, 2019 at 15:05

1 Answer


The ten bytes written to buf are nine '0' digits and one '\0' (zero terminator). A precision of .100 on %d zero-pads the conversion, so rendering x == 0 in full would produce 100 '0' characters. counter is assigned 100 because %n records the number of characters that would have been written had the buffer been large enough, not the number actually stored:

   buf <== "000000000"
%.100d <== "0000000000 ..... 0000000000"   (100 zeros in all)

Note that buf is truncated: it holds only the first 9 of the 100 digits.


1 Comment

I think OP wants to know why the counter gets the required value when the buffer is too short.
