user280105

Reputation: 33

Running time of merge sort on linked lists?

I came across this piece of code that performs merge sort on a linked list. The author claims it runs in O(n log n) time. Here is the link: http://www.geeksforgeeks.org/merge-sort-for-linked-list/

My claim is that it takes at least O(n^2) time, and here is my argument. You divide the list (be it an array or a linked list) log n times (refer to the recursion tree). During each partition, given a list of size i = n, n/2, ..., n/2^k, we take O(i) time to split it. Since the sum of the O(i) terms at one level is O(n), we can say (sloppily) that a partition pass takes O(n) time. Given the time for a single partition, the question is how many partitions happen in total. The number of partitions at level i is 2^i, so summing 2^0 + 2^1 + ... + 2^(lg n - 1) gives 2^(lg n) - 1, which is nothing but n - 1 on simplification. This implies we call partition n - 1 times (let's approximate that to n), so the complexity is at least Omega(n^2). If I am wrong, please let me know where. Thanks :)

And then, after some retrospection, I applied the master method to the recurrence relation, replacing the Theta(1) term from conventional merge sort on arrays with Theta(n) for this version (because the divide and combine operations each take Theta(n) time). The running time turned out to be Theta(n lg n). I also noticed that the cost at each level is n (because 2^i * (n / 2^i) = n is the time taken at each level), so it is Theta(n) per level times lg n levels, implying Theta(n lg n). Did I just solve my own question? Please help, I am kind of confused myself.

Upvotes: 1

Views: 413

Answers (2)

shebang

Reputation: 413

You are performing the same sum twice for some weird reason.

To split and merge a linked list of size n takes O(n) time. The depth of the recursion is O(log n).

Your argument was that a splitting step takes O(i) time and that the sum of the split steps at one level is O(n). But then you treated that sum as the time taken to perform only a single split and multiplied it by the number of splits again. That is where the double count happens.

Instead, consider this: a problem of size n forms two n/2 subproblems, four n/4 subproblems, eight n/8 subproblems, and so on, until 2^(log n) subproblems of size n/2^(log n) are formed. Each level costs O(n) in total and there are O(log n) levels, so summing these up gives O(n log n) to perform the splits, and another O(n log n) to combine the subproblems.
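To make the "O(n) per level" claim concrete: the split walks the list once with slow/fast pointers and the merge walks each sublist once. Here is a minimal Python sketch of that scheme (illustrative only; names like `split` and `merge` are my own, this is not the GeeksforGeeks code):

```python
class Node:
    def __init__(self, data, next=None):
        self.data = data
        self.next = next

def split(head):
    # Slow/fast pointer walk: O(n) to find and cut at the midpoint.
    slow, fast = head, head.next
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    mid = slow.next
    slow.next = None
    return head, mid

def merge(a, b):
    # Merge two sorted lists; each node is visited once, so O(n) total.
    dummy = tail = Node(None)
    while a and b:
        if a.data <= b.data:
            tail.next, a = a, a.next
        else:
            tail.next, b = b, b.next
        tail = tail.next
    tail.next = a or b
    return dummy.next

def merge_sort(head):
    # O(n) split + O(n) merge per level, O(log n) levels.
    if head is None or head.next is None:
        return head
    left, right = split(head)
    return merge(merge_sort(left), merge_sort(right))

# Small helpers for building and reading lists.
def from_list(values):
    head = None
    for v in reversed(values):
        head = Node(v, head)
    return head

def to_list(head):
    out = []
    while head:
        out.append(head.data)
        head = head.next
    return out
```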

Upvotes: 0

Timothy Shields

Reputation: 79501

The recursive complexity definition for an input list of size n is

T(n) = O(n) + 2 * T(n / 2)

Expanding this we get:

T(n) = O(n) + 2 * (O(n / 2) + 2 * T(n / 4))
     = O(n) + O(n) + 4 * T(n / 4)

Expanding again we get:

T(n) = O(n) + O(n) + O(n) + 8 * T(n / 8)

Clearly there is a pattern here. Since we can repeat this expansion exactly O(log n) times, we have

T(n) = O(n) + O(n) + ... + O(n)   (O(log n) terms)
     = O(n log n)
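You can also check this expansion empirically by counting the O(n) work done at each level. A hedged Python sketch (array-based for brevity; the counts are the same shape for the linked-list version, and the `counter` instrumentation is mine, not part of any standard implementation):

```python
import math

def merge_sort(a, counter):
    # counter[0] accumulates the "O(n)" work of each recursive call.
    if len(a) <= 1:
        return a
    counter[0] += len(a)          # split: touches every element once
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    counter[0] += len(a)          # merge: touches every element once
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged

for n in (1 << 10, 1 << 14):
    counter = [0]
    merge_sort(list(range(n, 0, -1)), counter)
    # Total work is about 2 * n * log2(n), nowhere near n^2.
    assert counter[0] <= 3 * n * math.log2(n)
    assert counter[0] < n * n / 10
```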

Upvotes: 4
