user3366240

Reputation: 85

Calculating the running time of a function

I have trouble coming up with the running time of a function that calls other functions. For example, here is a function that converts a binary tree to a list:

(define (tree->list-1 tree)
  (if (null? tree)
      '()
      (append (tree->list-1 (left-branch tree))
              (cons (entry tree)
                    (tree->list-1 (right-branch tree))))))

The explanation is T(n) = 2*T(n/2) + O(n/2), because the procedure append takes linear time.
Solving the above recurrence, we get T(n) = O(n * log n).
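One way to see where O(n * log n) comes from is to unfold the recurrence level by level. Writing the append cost at a node whose subtree holds n entries as cn/2 for some constant c (an assumption about the hidden constant, not from the question), the recursion tree gives:

```latex
T(n) = 2\,T(n/2) + \frac{cn}{2}
     = 4\,T(n/4) + \frac{cn}{2} + \frac{cn}{2}
     = \cdots
     = \underbrace{\frac{cn}{2} + \frac{cn}{2} + \cdots + \frac{cn}{2}}_{\log_2 n \ \text{levels}}
     = \frac{cn}{2}\,\log_2 n \;=\; O(n \log n)
```

Each level of the recursion tree does cn/2 total work, and a balanced tree has log2 n levels.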

However, cons is also a procedure, and it combines two elements. Since it is applied at every entry node, why don't we add another O(n) to the solution?

Thank you for any help.

Upvotes: 1

Views: 95

Answers (2)

soegaard

Reputation: 31147

If I understand correctly, you are asking about the difference between append and cons.

The time used by (cons a b) does not depend on the values of a and b. The call allocates some memory, tags it with a type tag ("pair") and stores pointers to the values a and b in the pair.

Compare this to (append xs ys). Here append needs to make a new list consisting of the elements in both xs and ys. This means that if xs is a list of n elements, then append needs to allocate n new pairs to hold the elements of xs.

In short: append needs to copy the elements in xs, and thus its time is proportional to the length of xs. The function cons takes the same constant time no matter which arguments it is called with.
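This difference can be made concrete by counting allocations. The sketch below (Python rather than Scheme, with a hypothetical Pair class standing in for a cons cell) is only an illustration of the point above, not the actual Scheme runtime:

```python
# Illustrative sketch: a hypothetical Pair class that counts allocations,
# to show why cons is O(1) while append is O(length of xs).

class Pair:
    """A cons cell: two pointers, allocated in constant time."""
    allocations = 0

    def __init__(self, car, cdr):
        Pair.allocations += 1
        self.car = car
        self.cdr = cdr

def cons(a, b):
    # One allocation, no matter how large a and b are.
    return Pair(a, b)

def append(xs, ys):
    # Copies every cell of xs: allocations proportional to its length.
    if xs is None:
        return ys
    return Pair(xs.car, append(xs.cdr, ys))

def from_list(items):
    # Build a linked list from a Python list (helper for the demo).
    result = None
    for x in reversed(items):
        result = Pair(x, result)
    return result

xs = from_list([1, 2, 3])
ys = from_list([4, 5])

Pair.allocations = 0
cons(xs, ys)
cons_allocs = Pair.allocations      # 1: cons allocates a single pair

Pair.allocations = 0
append(xs, ys)
append_allocs = Pair.allocations    # 3: one new pair per cell of xs

print(cons_allocs, append_allocs)
```

No matter how long xs and ys are, cons performs exactly one allocation, while append's allocation count grows with xs.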

Upvotes: 1

KeV

Reputation: 2891

Consider O(n^2), which is clearly quadratic. Now consider O(n^2 + n): this is still quadratic, so we can reduce it to O(n^2), because the + n term is not significant (it does not change the order of growth).

The same applies here, so we can reduce O([n*log(n)] + n) to O(n*log(n)). However, we may not reduce it further to O(log(n)), since that would be logarithmic, and the function is not.
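A quick numeric check (my own illustration, not part of the answer) shows why the + n term is absorbed: the ratio (n*log2(n) + n) / (n*log2(n)) equals 1 + 1/log2(n), which shrinks toward 1 as n grows.

```python
import math

# The extra "+ n" term becomes relatively insignificant as n grows,
# which is why O(n*log(n) + n) collapses to O(n*log(n)).
for n in (10**3, 10**6, 10**9):
    ratio = (n * math.log2(n) + n) / (n * math.log2(n))
    print(n, round(ratio, 3))   # ratio approaches 1 as n grows
```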

Upvotes: 1
