Reputation: 870
In lecture my professor was explaining the big-O time of various arithmetic operations. He told us that long division is around O(n^2). Looking around online, it seems that this is correct, but why?
Can anyone explain in detail why long division takes O(n^2) time?
Upvotes: 2
Views: 5803
Reputation: 59
First of all, a definition: let q be the number of digits required in the quotient, i.e. the precision to which you want the answer.
The generation of each digit of the quotient requires two operations:
1. Multiplying the divisor by a candidate quotient digit.
2. Subtracting that product from the current remainder.
The computational complexity doesn't depend on the size of the dividend. For example, if you want to compute 1/3 to 5 places (.33333) it will take 5 iterations, but if you want to compute it to 20 places (.33333333333333333333) it will take 20 iterations.
The number of digits in the divisor is not actually a factor either. Say you want to compute 1/pi accurate to 4 places. You don't need every digit of pi to compute this (which is convenient, because there are a lot!). You can ignore all but the first few (approximately q) digits of the divisor. In this case, it is sufficient to use the 4 most significant digits of pi to get 1/pi accurate to 4 places: 1/3.141 ~= 0.3183 (for context, 1/pi to 20 digits of accuracy is 0.31830988618379067153).
Therefore you need q iterations, and there is O(q) work in each iteration, so the computational complexity of long division is O(q^2).
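Here is a minimal Python sketch of that digit-by-digit process (the function name and the integer-only interface are illustrative assumptions, not part of the argument above): the loop runs exactly q times, and each pass does one multiply and one subtract.

```python
def long_divide(dividend, divisor, q):
    """Schoolbook long division computing q fractional digits of dividend/divisor.

    Illustrative sketch: assumes integer inputs with 0 < dividend < divisor,
    so the quotient has the form 0.ddd...
    """
    digits = []
    remainder = dividend
    for _ in range(q):                 # exactly q iterations, one per quotient digit
        remainder *= 10                # "bring down" the next place
        digit = remainder // divisor   # pick the trial digit (by hand: multiply the
                                       # divisor by candidates and compare)
        remainder -= digit * divisor   # multi-digit multiply and subtract; per the
                                       # argument above, only ~q digits of the divisor
                                       # matter, so this is O(q) work
        digits.append(str(digit))
    return "0." + "".join(digits)

print(long_divide(1, 3, 5))          # 0.33333
print(long_divide(1, 3, 20))         # 0.33333333333333333333
print(long_divide(10000, 31410, 4))  # 0.3183, i.e. 1/3.141 to 4 places
```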
Upvotes: 1
Reputation: 4845
It's quadratic in the number of bits of the numbers you're dividing, which means that to divide n by m you need O((log max(m, n))^2) time. This is because there are O(log max(m, n)) subtraction steps (one per bit of the quotient), and each subtraction takes O(log max(m, n)) time.
An explanation is available here on StackOverflow.
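For a concrete picture of where the bound comes from, here is a minimal sketch (my own illustration, not the linked explanation) of binary shift-and-subtract division: the loop runs once per bit of the dividend, and each pass does at most one comparison and one subtraction on O(log max(m, n))-bit numbers, which multiplies out to O((log max(m, n))^2).

```python
def binary_long_division(n, m):
    """Shift-and-subtract (binary long) division of nonnegative n by positive m.

    Illustrative sketch: one loop pass per bit of the dividend, and each pass
    does a comparison/subtraction on numbers of O(log max(m, n)) bits.
    """
    quotient = 0
    remainder = 0
    for i in reversed(range(n.bit_length())):          # O(log n) passes
        remainder = (remainder << 1) | ((n >> i) & 1)  # bring down the next bit
        quotient <<= 1
        if remainder >= m:                             # O(log max(m, n))-bit compare
            remainder -= m                             # O(log max(m, n))-bit subtract
            quotient |= 1
    return quotient, remainder

print(binary_long_division(1000, 7))   # (142, 6), same as divmod(1000, 7)
```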
Upvotes: 5