MatsuzakaSteven

Reputation: 317

Big O and Running Time?

Let's say we have two algorithms that solve the same problem.

Algorithm A has O(n) running-time complexity, and Algorithm B has O(n^3) running-time complexity.

For an input of the size of 1000, A takes 1 second, and B takes 3 seconds.

Having said that, how long will Algorithms A and B take for input of size 3000?

My guess is: A takes 1 s × 3 = 3 seconds (linear scaling), and B takes 3 s × 3^3 = 81 seconds (cubic scaling).

Second question: if I were to ramp up the processor speed and memory size to twice the original, would the running-time complexity of an O(n) algorithm still be O(n)?

Please provide an explanation.
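Assuming pure n and n^3 scaling with no constant overhead (the textbook assumption; the answers below explain why it is shaky in practice), the extrapolation can be sketched as:

```python
# Sketch: extrapolating run times under pure big-O scaling.
# Assumes time(n) = c * f(n) exactly, with no fixed overhead.

def predict(t_old, n_old, n_new, f):
    """Scale a measured time t_old at size n_old to size n_new,
    assuming run time is proportional to f(n)."""
    return t_old * f(n_new) / f(n_old)

# Algorithm A: O(n), 1 second at n = 1000
t_a = predict(1.0, 1000, 3000, lambda n: n)     # 1 * 3 = 3 seconds

# Algorithm B: O(n^3), 3 seconds at n = 1000
t_b = predict(3.0, 1000, 3000, lambda n: n**3)  # 3 * 27 = 81 seconds

print(t_a, t_b)  # 3.0 81.0
```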

Upvotes: 1

Views: 1537

Answers (3)

Gordon Linoff

Reputation: 1269503

This is a little long for a comment.

Big-oh notation is about what happens as the data goes to infinity. If you have one data point:

n=1000, time=1 sec

You don't have enough information to generalize. This could be:

1/1000 seconds per n, with no fixed overhead
1/1,000,000 seconds per n, plus 0.999 seconds of fixed overhead

You could also have smaller factors -- such as sqrt(n) or log(n) -- in the actual run time.

And for the second algorithm, the situation is more complicated still.

The important thing to remember is that big-O is a theoretical measure of algorithmic complexity. It does not substitute for actually analyzing a specific algorithm on a specific machine.
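To make this concrete, here is a hypothetical sketch of the two cost models listed above: both reproduce the single observation time(1000) = 1 second exactly, yet they diverge at n = 3000.

```python
# Two hypothetical cost models, both matching the one observation
# time(1000) = 1 second, yet predicting different times at n = 3000.

def model_no_overhead(n):
    # 1/1000 second per element, no fixed overhead
    return n / 1000.0

def model_big_overhead(n):
    # 1/1,000,000 second per element, plus 0.999 s fixed overhead
    return n / 1_000_000.0 + 0.999

assert abs(model_no_overhead(1000) - 1.0) < 1e-9
assert abs(model_big_overhead(1000) - 1.0) < 1e-9

print(model_no_overhead(3000))   # 3.0 seconds
print(model_big_overhead(3000))  # 1.002 seconds
```

Both models are O(n), yet one data point cannot tell them apart, and their predictions at n = 3000 differ by a factor of three.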

EDIT:

Technically, two functions have the same "big-Oh" when the following limit is a finite, nonzero constant:

f1(n)
-----  as n --> infinity
f2(n)

For convenience, we specify big-Oh using simple functions, such as n^2, log(n), e^n, and so on.

It is only a statement about what happens as the size goes to infinity. Period. There are many examples of algorithms with poor complexity that work pretty well in real-world problems. For instance, quicksort has a worst-case complexity (hence "complexity") of O(n^2), but works so well in the average case that it is often the sorting algorithm of choice.
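The quicksort point can be illustrated with a small sketch (a naive first-element-pivot quicksort that counts comparisons, not a production implementation): already-sorted input triggers the O(n^2) worst case, while random input stays close to n log n.

```python
import random

def quicksort_comparisons(a):
    """Naive quicksort with first-element pivot; returns the
    number of element comparisons performed."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    less = [x for x in a[1:] if x < pivot]
    more = [x for x in a[1:] if x >= pivot]
    return (len(a) - 1) + quicksort_comparisons(less) + quicksort_comparisons(more)

n = 500
sorted_input = list(range(n))           # worst case for this pivot choice
random_input = random.sample(range(n), n)

worst = quicksort_comparisons(sorted_input)  # n*(n-1)/2 = 124750 comparisons
avg = quicksort_comparisons(random_input)    # typically a few thousand

print(worst, avg)
```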

Upvotes: 1

chux

Reputation: 153338

how long will Algorithms A and B take for input of size 3000?

If the run time for n == 0 is 0 seconds, then the two guesses are reasonable predictions.


I were to ramp up the processor speed ... to twice of the original,

Run times would be expected to halve, assuming the limiting factor was processor speed and not I/O throughput, etc., but this does not change big-O.

... to memory size to twice of the original ...

Again, this does not change big-O. I would not expect a run-time reduction unless the process was memory-bound.
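A sketch of why a faster machine leaves big-O unchanged: doubling the speed halves the constant factor c in time(n) = c*n, and c is exactly what big-O discards. The ratio of the two run-time functions is a fixed constant for every n, so they have the same order of growth.

```python
# Hypothetical per-element cost c; values are illustrative only.

def time_old(n, c=0.001):
    return c * n          # original machine: O(n)

def time_new(n, c=0.001):
    return (c / 2) * n    # twice as fast: constant halved, still O(n)

# The ratio is the constant 2 for every n, so by the limit
# definition the two functions have the same big-O order.
for n in (10, 1000, 10**6):
    assert time_old(n) / time_new(n) == 2.0

print(time_new(1000))  # 0.5 -- half the time, same O(n)
```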

Upvotes: 1

Ulrich Eckhardt

Reputation: 17415

All these complexity claims start with a sentence which you have forgotten in your calculations:

There is an n0 such that for all n greater than n0 ...

Since you have no info about this n0 value, you can't draw any conclusions concerning the timing behaviour of the two algorithms.

Just to illustrate this with an example, consider the possibility that the input data set fits into the CPU cache or not. Once it grows too big for the cache, there is usually a large increase in runtime.
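A toy illustration of the n0 clause, with made-up cost functions: an O(n) algorithm with a large constant factor (say, heavy per-step overhead) only beats an O(n^2) algorithm once n passes the crossover point.

```python
# Hypothetical cost functions chosen so the crossover is visible.

def cost_linear(n):
    return 100 * n        # O(n), but with a large constant factor

def cost_quadratic(n):
    return n * n          # O(n^2), constant factor 1

# Crossover: 100*n == n*n  at  n0 == 100.
assert cost_quadratic(50) < cost_linear(50)      # below n0, O(n^2) wins
assert cost_linear(1000) < cost_quadratic(1000)  # above n0, O(n) wins
```

Below n0 the "worse" algorithm is actually faster, which is why a single small-n timing says nothing about asymptotic behaviour.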

Upvotes: 0
