anshdeb

Reputation: 1

Running time of printf() function

I was just wondering what the running time of the printf() function actually is. Whether it is O(n) or Ω(n), on what basis can I decide the running time of printf? If it just prints a single line on the screen, it could be O(1), since that would take only a fixed number of CPU cycles. But what if printf is used in a loop, where the number of times it will be called is not known in advance? Will it be O(n) then, or something else?

Upvotes: 0

Views: 3438

Answers (2)

Keith Thompson

Reputation: 263187

The C standard says nothing about the running time of printf; it only specifies its behavior. Some implementations might be more efficient than others.

You asked about O(n) or Ω(n), not about absolute running time. I can't think of any reason why printf shouldn't be O(n) where n is the number of characters it prints. (I don't remember the distinction between O(n) and Ω(n); it's been a while since college.)

Though I suppose you could construct a format string with a very long sequence of %n specifiers which would probably take time proportional to the length of the format string while producing no output. So in the most general case I suppose it's O(m+n), where m is the length of the format string and n is the number of characters printed.
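As a minimal sketch of that corner case (the format string and argument count here are just assumptions for illustration, and note that some hardened C libraries restrict the %n specifier):

```c
#include <stdio.h>

int main(void)
{
    int count = 0;

    /* Every conversion below is %n, which stores the number of characters
       written so far and produces no output.  printf still has to scan the
       whole format string, so the cost is driven by its length (m), not by
       the number of characters printed (n, which is 0 here). */
    printf("%n%n%n%n%n%n%n%n",
           &count, &count, &count, &count,
           &count, &count, &count, &count);

    printf("characters written by the previous call: %d\n", count); /* prints 0 */
    return 0;
}
```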

The implementation of printf will traverse the format string and the other arguments, producing output characters as it goes. I don't think there are any cases where the likely complexity exceeds O(n).

If you call printf in a loop in your program, you're no longer asking about the run time of printf, but about the run time of your algorithm that happens to call printf. That's impossible to answer without more information.
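A hypothetical sketch of that last point (the loop body and the value of n are made up for the example): the O(n) growth comes from the caller's loop, not from printf itself.

```c
#include <stdio.h>

/* Each call emits a short, bounded line, so the per-call cost is effectively
   constant; the overall O(n) behaviour belongs to the caller's algorithm. */
static void print_squares(int n)
{
    for (int i = 0; i < n; i++)
        printf("%d squared is %d\n", i, i * i);
}

int main(void)
{
    print_squares(10);   /* n chosen arbitrarily for the example */
    return 0;
}
```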

Upvotes: 0

ikegami

Reputation: 385546

The time taken by printf will always be proportional to the number of characters to print, so it's Θ(N) (and thus O(N) and Ω(N)), where N is the number of characters to print.
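A rough sketch of that claim (the buffer length and contents are arbitrary assumptions for the example): printf has to hand every one of the N characters to the output stream, so the work grows linearly with N.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    size_t n = 1000;                /* arbitrary length for the example */
    char *buf = malloc(n + 1);
    if (buf == NULL)
        return 1;

    memset(buf, 'x', n);            /* fill with n printable characters */
    buf[n] = '\0';

    printf("%s\n", buf);            /* Theta(n): every character is written */

    free(buf);
    return 0;
}
```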


But what if N wasn't the number of characters?

When evaluating algorithms that operate on lists of strings (e.g. sorting algorithms), it's usually uninteresting to study what happens when the length of the strings grows, since that affects all algorithms identically. We're more interested in what happens when the size of the list grows. To do that, we treat the length of the strings as constant.

For example, a sorting algorithm might take O((N log N) * M), where N is the number of elements and M is the length of the strings. If we treat the length of the strings as constant, that becomes O(N log N), and printing an element goes from Θ(M) to Θ(1).
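A small sketch of that convention using the standard library's qsort (the word list and its size are made up for the example):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Comparator for an array of char pointers.  Each comparison costs up to
   O(M), where M is the length of the strings being compared. */
static int cmp_str(const void *a, const void *b)
{
    return strcmp(*(const char *const *)a, *(const char *const *)b);
}

int main(void)
{
    const char *words[] = { "pear", "apple", "cherry", "banana" };
    size_t n = sizeof words / sizeof words[0];

    /* O(N log N) comparisons, each up to O(M): O((N log N) * M) overall.
       Treating M as constant gives the familiar O(N log N). */
    qsort(words, n, sizeof words[0], cmp_str);

    /* By the same convention, each printf here is counted as Theta(1),
       even though it really does Theta(M) work per string. */
    for (size_t i = 0; i < n; i++)
        printf("%s\n", words[i]);

    return 0;
}
```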

Upvotes: 1
