Reputation:
I have studied a lot of algorithms, and it seems that for an algorithm to achieve the best possible complexity in one resource (say, time), it has to sacrifice the other (space). I want to understand what prevents the two from being inversely proportional.
Upvotes: 0
Views: 1354
Reputation: 19601
To exaggerate: economics says they are inversely proportional, but mathematics says they are not.
If you plot the time and space complexity of a large number of algorithms, all of the competitive ones lie on a Pareto curve trading time against space, because an algorithm that takes both more time and more space than a competitor is simply not competitive. There is even a family of techniques known as time-memory tradeoffs.
But in the very long term the two are theoretically linked, and in the same direction rather than inversely: it takes at least N steps to use N cells of space, and if a finite state machine has only N bits of space it has only 2^N possible configurations, so it cannot run for more than 2^N steps without going into an infinite loop.
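A compact way to write the link that last sentence sketches - this is the standard bound stated loosely, glossing over constants and the usual S(n) >= log n caveat:

```latex
% Loose two-sided link for a halting machine: touching a cell
% costs at least one step, and S(n) bits of state admit only
% 2^{S(n)} distinct configurations, so the machine must halt
% within that many steps or loop forever.
\[
  S(n) \;\le\; T(n) \;\le\; 2^{O(S(n))}
\]
```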
There is a very theoretical treatment in "A machine-independent theory of the complexity of recursive functions" by Blum (JACM 1967). Theorem 3 proves a link between any two complexity measures, which I think is, however, so loose as to be completely useless for any practical purpose. It does say, "This means that Phi(n) and Phi'(n) do not differ too much from each other" (where I have rendered the symbols in the paper's table as Phi and Phi') - but to agree with "do not differ too much" you have to accept that n and 2^n are pretty much the same thing.
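For reference, the relatedness result being paraphrased reads roughly as follows - this is from memory, so treat it as a sketch and check the paper for the exact statement:

```latex
% Blum's relatedness theorem (paraphrased): for any two complexity
% measures Phi and Phi' there is a total recursive function r such
% that, for every program i and almost all n,
\[
  \Phi'_i(n) \le r\bigl(n,\, \Phi_i(n)\bigr)
  \qquad\text{and}\qquad
  \Phi_i(n) \le r\bigl(n,\, \Phi'_i(n)\bigr).
\]
% Nothing stops r from growing extremely fast, which is why this
% bound is compatible with a gap as large as n versus 2^n.
```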
Upvotes: 1
Reputation: 7714
The definitive answer is no, but a softer answer is "not necessarily."
Sometimes, when you are solving a problem, you may think: "I can approach this by brute force, which takes O(n²) time and O(1) space... but maybe I can build up a data structure using O(n log(n)) space and then answer each query in O(log(n)) time."
In lots of cases there is exactly this trade-off between time and space complexity: you allow yourself more space, preprocess the input, and build structures so you can answer faster; or you save memory and just run a straightforward algorithm. A sketch of the pattern follows below.
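Here is a minimal sketch of that pattern in Python, using range-sum queries as an illustrative problem (the choice of problem is mine, not the answer's):

```python
# Time-space trade-off sketch: answer "sum of a[lo:hi]" queries
# either with no extra space, or with an O(n) precomputed table.

def range_sum_bruteforce(a, lo, hi):
    """O(1) extra space, but O(n) time per query."""
    return sum(a[i] for i in range(lo, hi))

def build_prefix_sums(a):
    """O(n) extra space, spent once during preprocessing."""
    prefix = [0]
    for x in a:
        prefix.append(prefix[-1] + x)
    return prefix

def range_sum_fast(prefix, lo, hi):
    """O(1) time per query, paid for with the O(n) table."""
    return prefix[hi] - prefix[lo]

a = [3, 1, 4, 1, 5, 9, 2, 6]
prefix = build_prefix_sums(a)
assert range_sum_bruteforce(a, 2, 6) == range_sum_fast(prefix, 2, 6)
```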
The fact is that not all problems have this trade-off. For example, if I ask you, "Can you tell me the sum of all the elements of an array?", you could certainly build a tree out of the array, traverse it, and return the sum, spending O(n) extra space and O(n) time - but you don't have to: you can just add everything up in O(n) time using O(1) extra space.
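And the no-trade-off case from that example, for contrast (again just a sketch):

```python
# The array-sum example: a detour through a tree costs O(n) extra
# space and buys nothing, since a plain pass is already optimal
# at O(n) time and O(1) extra space.

def total(a):
    s = 0
    for x in a:   # one pass, no auxiliary structure
        s += x
    return s

assert total([3, 1, 4, 1, 5]) == 14
```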
In a nutshell: for many problems the available solutions sit at different points on a time-space trade-off, but not for all of them, and since the relationship fails in some cases, time and space complexity are not inversely proportional in general.
Upvotes: 1
Reputation: 59184
Time and space complexity are certainly not inversely proportional. What is true is the simple mathematical fact that you can only optimize a system for one metric at a time.
If you optimize an algorithm for time, then it's not likely to be optimal in space. If you optimize for space, then it's not likely to be optimal in time. If you optimize for something else, then it's not likely to be optimal in either time or space.
Generally we pick a practical trade-off. We are helped somewhat by the fact that it takes time to fill space, so time and space complexity are actually positively correlated in a lot of cases.
Upvotes: 3