Reputation: 206
def f1(n):
    for i in range(n):
        k = aux1(n - i)
        while k > 0:
            print(i*k)
            k //= 2
def aux1(m):
    jj = 0
    for j in range(m):
        jj += j
    return m
I am trying to calculate the time complexity of the function f1, and it's not really working out for me. I would appreciate any feedback on my work.
What I'm doing: I first tried substituting i = 1 and following one iteration. The function calls aux1 with m = n-1, aux1 iterates n-1 times and returns m = n-1, so back in f1 we have k = n-1, and the while k > 0 loop runs about log(n-1) times. So basically the first run costs O(n) for f1 (coming from the call to aux1).
But then the loop continues, calling aux1 with arguments n-1, n-2, n-3, ..., 1, and I am a little confused about how to continue calculating the time complexity from here, or whether I'm even on the right track.
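For reference, here is a rough instrumented sketch of the same two functions that counts the basic steps for small n (the *_counted names and the counts dict are just for illustration, not part of the original code):

def aux1_counted(m, counts):
    jj = 0
    for j in range(m):
        jj += j
        counts["aux1_steps"] += 1   # one unit of work per aux1 iteration
    return m

def f1_counted(n):
    counts = {"aux1_steps": 0, "while_steps": 0}
    for i in range(n):
        k = aux1_counted(n - i, counts)
        while k > 0:
            counts["while_steps"] += 1   # stand-in for the print
            k //= 2
    return counts

for n in (10, 100, 1000):
    print(n, f1_counted(n))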
Thanks in advance for any help and explanation!
Upvotes: 0
Views: 574
Reputation: 46
I wouldn't suggest the substitution approach for this type of question. Instead, try the approach where you calculate the order of each function directly from the number of operations it performs.
Let's analyze it by first checking the line below:
for i in range(n):
This loop runs O(n) times, without any doubt.
k = aux1(n - i)
Over all iterations of the outer loop, the cost of the above line is O(n * cost of aux1(n - i)).
Now let's find the complexity of aux1(n - i): because it consists of only one for loop, it runs in O(n). Hence the total contribution of the above line is O(n * n).
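To see that concretely, here is a small illustrative sketch (the aux1_steps name is just for this example) that counts how many times the loop body of aux1 executes:

def aux1_steps(m):
    steps = 0
    for j in range(m):
        steps += 1   # one unit of work per iteration
    return steps

print(aux1_steps(10))   # prints 10: the loop body runs exactly m times, i.e. O(m)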
Similarly, the cost of the while loop over the whole outer loop is O(n * cost of one run of the while loop):
while k > 0:
    print(i*k)
    k //= 2
One run of this loop takes about log(k) steps, and since k = n - i is at most n, log(k) is O(log(n)). So over all iterations of the outer loop, the while loop contributes O(n * log(n)).
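As a quick illustrative check (not part of the original code), you can count the halvings directly and compare with floor(log2(k)) + 1:

import math

def halvings(k):
    steps = 0
    while k > 0:
        steps += 1
        k //= 2
    return steps

for k in (1, 10, 100, 1000):
    print(k, halvings(k), math.floor(math.log2(k)) + 1)   # the two counts agree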
Now add up the overall complexities:
O(n * n) (from the calls to aux1) + O(n * log(n)) (from the while loop).
Since big-O keeps only the dominant term, this is O(n^2).
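If you want an empirical sanity check (purely illustrative; total_ops is not part of the original code), count the total number of basic operations and divide by n^2; the ratio settles near a constant, which is consistent with O(n^2):

def total_ops(n):
    ops = 0
    for i in range(n):
        ops += n - i        # aux1(n - i) does n - i loop iterations
        k = n - i           # aux1 returns its argument
        while k > 0:        # the halving loop
            ops += 1
            k //= 2
    return ops

for n in (100, 1000, 10000):
    print(n, total_ops(n) / n**2)   # approaches roughly 0.5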
Upvotes: 1
Reputation: 9647
This is all very silly but it can be figured out step by step.
The inner loop halves k every time, so its time complexity is O(log(aux1(n-i))). Now, what is aux1(n-i)? Its return value is simply n-i. But running it has time complexity O(n-i), because of that superfluous extra loop inside it.
So for the body of the outer loop we have one part with time complexity n-i and one part with log(n-i). Using the rules of time complexity, we can ignore the smaller part (the log) and focus on the larger part, which is O(n-i). (For example, when n-i is around a million, log2(n-i) is only about 20, so the linear part dominates.)
And now the outer loop runs i from 0 to n-1, which means the overall time complexity is O(n^2), because 1 + 2 + 3 + ... + n = O(n^2).
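Concretely, the calls are aux1(n), aux1(n-1), ..., aux1(1), and those costs sum to n(n+1)/2, which is about n^2/2. A quick illustrative check:

n = 1000
print(sum(n - i for i in range(n)), n * (n + 1) // 2)   # both print 500500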
Upvotes: 1