Reputation: 299
I have some algorithms that I am trying to find the time complexity for. I came up with some answers but I am not certain if they are right or not. Can someone help me out?
public static int f4(int N){
if (N == 0) return 0;
return f4(N/2) + f1(N) + f4(N/2);
} // f1 has a time complexity of O(n)
1) I believe this one is O(n).
public static int f6(int N){
if (N == 0) return 1;
return f6(N-1) + f6(N-1);
}
2) I believe this is O(n!) or O(n).
Upvotes: 0
Views: 1436
Reputation: 11
The function f4(N) calls itself twice with a reduced problem size of N/2. Each call also performs an operation f1(N), which has a time complexity of O(N).
The recurrence relation for this function can be written as follows:
T(N) = 2T(N/2) + O(N)
2T(N/2) - because the function calls itself twice with the input size halved.
O(N) - because each call does O(N) work in f1(N).
To solve this we can use the Master Theorem.
The general form for divide-and-conquer recurrences is:
T(N) = aT(N/b) + O(N^d)
where:
a = 2, because the function calls itself twice.
b = 2, because the problem size is halved in each call.
d = 1, because the work done in each call is linear, O(N).
Now, calculating log_b(a) = log_2(2) = 1.
According to the Master Theorem, when log_b(a) = d, the time complexity is O(N log N).
Therefore, the time complexity of f4(N) is O(N log N), not O(N).
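This can also be checked empirically. Below is a quick sketch (not part of the original answer; the class name F4Count and the convention of charging N units of work for each f1(N) call are my assumptions, based on the question's note that f1 is O(N)) that counts the total work done by f4:

```java
// Count the work done by f4, charging N units for each call to f1(N).
class F4Count {
    static long ops = 0;

    static int f4(int N) {
        if (N == 0) return 0;
        ops += N;                       // f1(N) is O(N): charge N units of work
        return f4(N / 2) + f4(N / 2);   // two recursive calls on half the size
    }

    public static void main(String[] args) {
        for (int n = 1 << 10; n <= 1 << 14; n <<= 2) {
            ops = 0;
            f4(n);
            // For O(N log N), ops / (N log2 N) should stay roughly constant.
            System.out.printf("N=%d ops=%d ops/(N log N)=%.3f%n",
                    n, ops, ops / (n * (Math.log(n) / Math.log(2))));
        }
    }
}
```

For N a power of two the count is exactly N * (log2(N) + 1): each of the log2(N) + 1 levels of the recursion tree contributes N units in total, matching the O(N log N) bound.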
The function f6(N) calls itself twice for each value of N. In other words, each call makes two additional calls with the argument N-1.
The recurrence relation is:
T(N) = 2T(N-1) + O(1)
2T(N-1) - because the function calls itself twice with the input size decreased to N-1.
O(1) - because the work done outside the recursive calls is constant.
If we expand the recursion tree, the number of calls doubles at each level, so the total number of calls grows exponentially, with 2^N calls made at the final level.
Thus, the time complexity of this function is O(2^N), which is exponential in nature.
Therefore, the time complexity of f6(N) is O(2^N), not O(N!) or O(N).
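To make this concrete, here is a small instrumented version (my own sketch, not from the answer; the counter and the class name F6Count are assumed). The total number of invocations satisfies C(N) = 1 + 2C(N-1) with C(0) = 1, which solves to 2^(N+1) - 1, i.e. Theta(2^N):

```java
// Count every invocation of f6 to confirm the exponential growth.
class F6Count {
    static long calls = 0;

    static int f6(int N) {
        calls++;                         // one unit of work per invocation
        if (N == 0) return 1;
        return f6(N - 1) + f6(N - 1);    // two recursive calls per level
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 12; n += 4) {
            calls = 0;
            f6(n);
            System.out.println("N=" + n + " calls=" + calls
                    + " (2^(N+1)-1 = " + ((1L << (n + 1)) - 1) + ")");
        }
    }
}
```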
Upvotes: 0
Reputation: 1
I think it is better to solve this by unrolling the recurrence, just as the call pattern does. For function f6:
T(n) = 2T(n-1) + c => T(n-1) = 2T(n-2) + c
Then:
T(n) = 2(2T(n-2) + c) + c
T(n) = 4T(n-2) + 2c + c => T(n-2) = 2T(n-3) + c
T(n) = 4(2T(n-3) + c) + 2c + c
T(n) = 8T(n-3) + 4c + 2c + c
...
T(n) = 2^i T(n-i) + 2^(i-1)c + 2^(i-2)c + ... + 2^0 c
When i reaches n, that is i = n:
T(n) = 2^n T(0) + 2^(n-1)c + 2^(n-2)c + ... + 2^0 c
T(n) = O(2^n)
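The expansion can be double-checked numerically. This small sketch (the class name Unroll, the base case T(0) = 1, and the cost c = 1 are my assumptions) compares the recurrence against the closed form 2^n * T(0) + (2^n - 1) * c = 2^(n+1) - 1:

```java
// Unroll T(n) = 2*T(n-1) + c numerically and compare to the closed form.
class Unroll {
    static long T(int n) {
        return n == 0 ? 1 : 2 * T(n - 1) + 1;   // T(0) = 1, c = 1
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 10; n++) {
            long closed = (1L << (n + 1)) - 1;  // 2^n*T(0) + (2^n - 1)*c
            System.out.println("n=" + n + " T(n)=" + T(n)
                    + " closed form=" + closed);
        }
    }
}
```

The geometric sum of the c terms is dominated by the 2^n term, which is why the answer is O(2^n) regardless of the constant c.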
Upvotes: 0
Reputation: 12801
Akshat's answer explains this in detail: https://stackoverflow.com/a/43880977/6866309
The second function would become O(N!) if it had a loop inside that makes one recursive call per iteration, since then T(N) = N*T(N-1) + O(N). Something like the example below is O(N!):
static int f6(int N) {
    if (N == 0) return 1;
    int sum = 0;
    for (int i = 0; i < N; i++) {
        sum += f6(N - 1); // one recursive call per loop iteration
    }
    return sum;
}
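To see the factorial growth, one can count calls in a variant with one recursive call per loop iteration (an illustrative sketch; the class name FactorialCalls and the counter are my additions). The call count satisfies C(N) = 1 + N*C(N-1) with C(0) = 1, which grows like e * N!; as a side effect this variant actually returns N!:

```java
// Count invocations of the loop-based variant to confirm O(N!) growth.
class FactorialCalls {
    static long calls = 0;

    static int f6(int N) {
        calls++;
        if (N == 0) return 1;
        int sum = 0;
        for (int i = 0; i < N; i++) {
            sum += f6(N - 1);   // one recursive call per loop iteration
        }
        return sum;             // sums N copies of f6(N-1), so f6(N) = N!
    }

    public static void main(String[] args) {
        long fact = 1;
        for (int n = 1; n <= 8; n++) {
            fact *= n;
            calls = 0;
            f6(n);
            // calls / N! approaches e (about 2.718), i.e. Theta(N!) growth.
            System.out.printf("N=%d calls=%d calls/N!=%.3f%n",
                    n, calls, (double) calls / fact);
        }
    }
}
```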
Upvotes: -1
Reputation: 9846
The master theorem states that, given a recurrence of the form f(n) = a*f(n/b) + O(n^k)
, three types of complexity may result:
O(n^k) if log_{b}(a) < k
O(n^k log n) if log_{b}(a) == k
O(n^(log_{b}(a))) if log_{b}(a) > k
The first recurrence takes the form:
f4(N) = 2f4(N/2) + O(n) = 2f4(N/2) + O(n^1)
This clearly falls into the second case, as log_{2}(2) == 1. The correct answer is therefore O(n log n).
Your first problem therefore has linearithmic, O(n log n), time complexity.
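The three cases above can be sketched as a tiny case-picker (illustrative only; the class name Master is mine, and this simplified version ignores the theorem's regularity conditions):

```java
// Pick the Master Theorem case for T(n) = a*T(n/b) + O(n^k).
class Master {
    static String complexity(int a, int b, int k) {
        double critical = Math.log(a) / Math.log(b);        // log_b(a)
        if (Math.abs(critical - k) < 1e-9)
            return "O(n^" + k + " log n)";                  // log_b(a) == k
        if (critical < k)
            return "O(n^" + k + ")";                        // log_b(a) < k
        return "O(n^log_" + b + "(" + a + "))";             // log_b(a) > k
    }

    public static void main(String[] args) {
        System.out.println(complexity(2, 2, 1));  // f4's recurrence: O(n^1 log n)
    }
}
```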
The recurrence takes the form
f6(N) = 2f6(N - 1)
which you can expand as follows:
f6(N) = (2^k)f6(N - k)
by simply applying the relationship to f6(N-1)
, f6(N-2)
, etc.
Since f6(0)
returns immediately, i.e. it takes one step, the full runtime complexity is
f6(N) = 2^N
just by plugging in k == N
.
Thus, the second problem has exponential time complexity.
Upvotes: 1