Reputation: 11441
Here is the program for the memoized version of matrix-chain multiplication from Introduction to Algorithms by Cormen et al.
MEMOIZED-MATRIX-CHAIN(p)
1  n = length[p] - 1
2  for i = 1 to n
3      do for j = i to n
4          do m[i, j] = infinity
5  return LOOKUP-CHAIN(p, 1, n)
LOOKUP-CHAIN(p, i, j)
1  if m[i, j] < infinity
2      then return m[i, j]
3  if i = j
4      then m[i, j] = 0
5      else for k = i to j - 1
6          do q = LOOKUP-CHAIN(p, i, k) + LOOKUP-CHAIN(p, k + 1, j) + p[i - 1] * p[k] * p[j]
7          if q < m[i, j]
8              then m[i, j] = q
9  return m[i, j]
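Not part of the book, but for anyone who wants to run this: a direct Python translation of the pseudocode above, using a nested function in place of the global table m (the 1-based indexing of CLRS is kept, so m has an unused row and column 0):

```python
import math

def memoized_matrix_chain(p):
    """Minimum number of scalar multiplications needed to compute the
    chain A_1 * A_2 * ... * A_n, where A_i has dimensions p[i-1] x p[i]."""
    n = len(p) - 1
    # m[i][j] = cost of multiplying A_i..A_j; infinity means "not computed yet"
    m = [[math.inf] * (n + 1) for _ in range(n + 1)]

    def lookup_chain(i, j):
        if m[i][j] < math.inf:          # already memoized: return immediately
            return m[i][j]
        if i == j:
            m[i][j] = 0                 # a single matrix needs no multiplication
        else:
            for k in range(i, j):       # try every split point A_i..A_k, A_{k+1}..A_j
                q = (lookup_chain(i, k) + lookup_chain(k + 1, j)
                     + p[i - 1] * p[k] * p[j])
                if q < m[i][j]:
                    m[i][j] = q
        return m[i][j]

    return lookup_chain(1, n)
```

On the book's example chain p = (30, 35, 15, 5, 10, 20, 25) this returns 15125.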
It is mentioned in the description that we can categorize the calls of LOOKUP-CHAIN into two types:
There are n² calls of the first type, one per table entry. All calls of the second type are made as recursive calls by calls of the first type. Whenever a given call of LOOKUP-CHAIN makes recursive calls, it makes O(n) of them. Therefore, there are O(n³) calls of the second type in all. Each call of the second type takes O(1) time, and each call of the first type takes O(n) time plus the time spent in its recursive calls. The total time is therefore O(n³).
My question is
Thanks!
Upvotes: 4
Views: 4990
Reputation: 16054
I think they meant that if you consider the tree of recursive calls, the calls of the second type appear as leaves (no more recursion), while the calls of the first type are the inner nodes of the tree. So yes, the second type is called by the first type.
The for loop on line 5 of LOOKUP-CHAIN performs at most n iterations (because 1 ≤ i ≤ j ≤ n), so it can make up to 2n = O(n) recursive calls.
Maybe it is easier to understand the complexity of this recursive algorithm if you picture it as a tree:
The inner nodes correspond to "type 1" calls: there cannot be more than n² of them (because of the memoization, each table entry is filled in only once), and each performs O(n) operations besides its recursive calls.
The leaves correspond to "type 2" calls: since there are at most n² inner nodes, each with at most 2n children, there cannot be more than 2n³ leaves, and each performs Θ(1) operations.
Upvotes: 2