murphy

Reputation: 11

Divide and Conquer Algorithms Numerical

If sorting a file of size n = 1000 takes 10 ms using the randomized Quicksort algorithm, approximately how much time would it take to sort a file of size n = 1,000,000,000? (Assume that all data fit in main memory.)

Upvotes: 0

Views: 331

Answers (1)

marc1s

Reputation: 779

If the average running time (number of basic operations) of randomized Quicksort is O(n log(n)), and sorting n = 10^3 elements takes 10 ms, then the relation 10 = t * 10^3 * log(10^3) holds, where t is the time per basic operation. Solving gives t = 10 / (10^3 * log(10^3)) ms. The time to sort n = 10^9 elements is therefore t * 10^9 * log(10^9). Substituting t, your computer needs 10 / (10^3 * log(10^3)) * 10^9 * log(10^9) ms = 10^7 * (9/3) ms = 3 * 10^7 ms, roughly 8.3 hours. (The base of the logarithm cancels in the ratio, so any base gives the same answer.)
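The extrapolation above can be sketched in a few lines of Python. The function name and default arguments are illustrative, not part of the question; it simply calibrates the constant factor from the measured run and scales by n log(n):

```python
import math

def estimated_time_ms(n_new, n_ref=1_000, t_ref_ms=10.0):
    """Extrapolate an O(n log n) runtime from a reference measurement.

    Calibrate the per-operation cost t from the reference run
    (t_ref_ms = t * n_ref * log(n_ref)), then apply it to n_new.
    The log base cancels in the ratio, so log10 is as good as any.
    """
    t = t_ref_ms / (n_ref * math.log10(n_ref))  # time per basic operation
    return t * n_new * math.log10(n_new)

print(estimated_time_ms(1_000_000_000))  # ~3e7 ms, about 8.3 hours
```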

Is that what you were looking for?

Upvotes: 2

Related Questions