Kristofer

Reputation: 1487

How to estimate the runtime when the input size is unknown

I am teaching myself algorithm analysis (the first 5 chapters) using this book: Introduction to Algorithms. Assume I have the following simple algorithm (I made it up):

for i <- 0 to 100
    for j <- 1 to 2000
        if (i*10 + j*20 = n)
            c <- c+1

Now, if I want to estimate the running time of this algorithm for a specific input size, say n = 500, I'd say:

The running time in the worst case is T(n) = 100*2000*500 = 10 * 10^7. However, when I want to generalize for any input size, i.e. n, I don't know how to do it! Kindly, can someone enlighten me?

Thank you

Upvotes: 0

Views: 101

Answers (1)

CoconutBandit

Reputation: 616

Your current running time is wrong. Since you always loop through all i and j values, the loops take a total of 100*2000 iterations no matter what n is; the input value n never affects how many times the body runs. However, if you treat the loop bounds as your unknowns, say the max i and j values are I and J respectively, then your algorithm has running time O(IJ). You're on the right track; just replace your constants with variables.
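To see this concretely, here is a quick sketch of your pseudocode in Python (the function name and the iteration counter are mine, added just for illustration), with the bounds I and J made parameters. Counting the iterations shows the total is the same for every n:

```python
def count_matches(n, I=100, J=2000):
    """Return (c, iterations) for the pseudocode above:
    c counts pairs with i*10 + j*20 == n; iterations counts loop-body runs."""
    c = 0
    iterations = 0
    for i in range(0, I + 1):        # i <- 0 to I (inclusive)
        for j in range(1, J + 1):    # j <- 1 to J (inclusive)
            iterations += 1
            if i * 10 + j * 20 == n:
                c += 1
    return c, iterations
```

Calling `count_matches(500)` and `count_matches(999999)` gives different counts c, but the exact same number of iterations, (I+1)*J. That's why the running time is O(IJ), independent of n.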

Upvotes: 1
