Reputation: 13487
I have a method which checks if an array is a heap. On each recursive call, it makes two further recursive calls, one on the left subtree and one on the right, to traverse the nodes and check that their values satisfy the heap property.
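For illustration, a minimal sketch of what such a check might look like, assuming a 0-indexed array-backed max-heap (the `isMaxHeap` name and exact structure are illustrative, not the asker's actual code):

```java
// Sketch of a recursive max-heap check on a 0-indexed array,
// where the children of index i are at 2*i + 1 and 2*i + 2.
static boolean isMaxHeap(int[] a, int i) {
    // A node past the end of the array is a leaf position, trivially valid.
    if (i >= a.length) {
        return true;
    }
    int left = 2 * i + 1;
    int right = 2 * i + 2;
    // If an existing child is larger than its parent, this is not a max-heap.
    if (left < a.length && a[left] > a[i]) {
        return false;
    }
    if (right < a.length && a[right] > a[i]) {
        return false;
    }
    // Recurse into both subtrees; when the array IS a heap,
    // every node ends up being visited.
    return isMaxHeap(a, left) && isMaxHeap(a, right);
}
```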
I want to calculate the Big O of this. I think the worst case is O(n), because if the array IS a heap the check never stops early and has to visit every node. I think the best case is O(3), i.e. constant time O(1), which would occur when the checks of the very first left subtree and right subtree both immediately return false (not a heap).
My question is: does this logic make sense? I think it does, but whenever I see the time complexity of a recursive function analyzed, it always seems to come out as some form of logarithmic time. It is almost as if there is some mysterious quality to recursive functions that nobody states explicitly. Why do recursive functions so often run in logarithmic time, and is my logic above valid?
Upvotes: 0
Views: 169
Reputation: 181
Yes, it makes sense. The reason you see so many recursive algorithms take logarithmic time is that they repeatedly divide the problem by some constant factor, so the recursion depth is only logarithmic in the input size. Your heap check does not do that: it recurses into both halves, so it ends up visiting every node in the worst case.
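For contrast, a sketch of a divide-by-half recursion such as binary search (my example, not from the question): each call keeps only half of the remaining range, so both the recursion depth and the running time are O(log n).

```java
// Illustrative recursive binary search on a sorted array:
// each call discards half the range, giving O(log n) depth.
static int binarySearch(int[] a, int target, int lo, int hi) {
    if (lo > hi) {
        return -1;  // not found
    }
    int mid = lo + (hi - lo) / 2;
    if (a[mid] == target) {
        return mid;
    } else if (a[mid] < target) {
        return binarySearch(a, target, mid + 1, hi);  // keep right half only
    } else {
        return binarySearch(a, target, lo, mid - 1);  // keep left half only
    }
}
```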
Upvotes: 1
Reputation: 65448
Yes, that makes sense. Only one of the three cases of the Master Theorem (though arguably the most interesting) has a logarithm.
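For concreteness (my summary, not part of the original answer), the heap check and a divide-by-half recursion like binary search land in different cases of the Master Theorem for T(n) = a T(n/b) + f(n):

```latex
% Heap check: two recursive calls on halves, constant work per node.
T(n) = 2\,T(n/2) + O(1), \quad a = 2,\ b = 2,\ n^{\log_2 2} = n
  \;\Rightarrow\; T(n) = \Theta(n) \quad \text{(case 1, no logarithm)}

% Binary search: one recursive call on half the range, constant work per call.
T(n) = T(n/2) + O(1), \quad a = 1,\ b = 2,\ n^{\log_2 1} = 1
  \;\Rightarrow\; T(n) = \Theta(\log n) \quad \text{(case 2, the logarithmic case)}
```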
Upvotes: 1