smallB

Reputation: 17120

What is the time complexity of the clear() function of std::map, in big-O terms?

What is the time complexity of std::map's clear() function?
Would I be right in saying that it is O(1)?

Upvotes: 3

Views: 8536

Answers (5)

Pavan Karke

Reputation: 1

If you want to erase all elements, it takes O(N); but instead of calling clear(), you can assign an empty map each iteration:

    std::map<int,int> m1, m2;   // m2 is kept empty

    // ... computations that update m1 ...

    m1 = m2;   // instead of m1.clear(), which is O(N); claimed here to be O(1)

Upvotes: 0

Matthieu M.

Reputation: 299770

The Standard says in [associative.reqmts]/8 Table 102:

    a.clear()    equivalent to: a.erase(a.begin(), a.end())    complexity: linear in a.size()

So it is actually mandated to be O(N).
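
You can see the linear scaling in practice with a quick timing sketch (a minimal illustration; the absolute numbers will vary by implementation and allocator):

    #include <chrono>
    #include <cstdio>
    #include <map>

    int main() {
        for (std::size_t n : {100000u, 200000u, 400000u, 800000u}) {
            std::map<int, int> m;
            for (std::size_t i = 0; i < n; ++i)
                m.emplace(static_cast<int>(i), 0);

            auto start = std::chrono::steady_clock::now();
            m.clear();  // mandated to be linear in m.size()
            auto stop  = std::chrono::steady_clock::now();

            std::printf("n = %zu: clear() took %lld us\n", n,
                static_cast<long long>(
                    std::chrono::duration_cast<std::chrono::microseconds>(
                        stop - start).count()));
        }
    }

Doubling n should roughly double the reported time.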


EDIT: summing up the various bits.

To remove a node, a map performs two operations:

  1. Call the allocator destroy method to destroy the element
  2. Call the allocator deallocate method to deallocate the memory occupied by the node

The former can be elided in code (by checking is_trivially_destructible), and this is in fact commonly done in vector implementations, for example. The latter is unfortunately trickier: no such trait exists, so we must rely on the optimizer.
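
A minimal sketch of that elision (destroy_element is a hypothetical helper, not from any particular standard library; C++17 if constexpr is used for brevity):

    #include <memory>
    #include <type_traits>

    // Hypothetical helper: run the element's destructor only when there
    // is actually something to run.
    template <typename Alloc, typename T>
    void destroy_element(Alloc& alloc, T* p) {
        if constexpr (!std::is_trivially_destructible_v<T>) {
            std::allocator_traits<Alloc>::destroy(alloc, p);  // calls ~T()
        }
        // For trivially destructible T this call compiles to nothing;
        // the per-node deallocate, however, still has to happen.
    }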

Unfortunately, even if by inlining the optimizer could completely remove the destroy and deallocate calls, I am afraid it would not be able to realize that the tree traversal is now useless and optimize that away too. Therefore you would end up with a Θ(N) traversal of the tree that does nothing at each step...
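
The per-node work can be made visible with a counting allocator (a sketch; CountingAlloc and g_deallocations are illustrative names, not standard components):

    #include <cstddef>
    #include <cstdio>
    #include <map>
    #include <memory>

    static std::size_t g_deallocations = 0;  // shared across rebound instantiations

    template <typename T>
    struct CountingAlloc {
        using value_type = T;
        CountingAlloc() = default;
        template <typename U> CountingAlloc(const CountingAlloc<U>&) {}

        T* allocate(std::size_t n) { return std::allocator<T>{}.allocate(n); }
        void deallocate(T* p, std::size_t n) {
            ++g_deallocations;                      // one call per node
            std::allocator<T>{}.deallocate(p, n);
        }
    };

    template <typename T, typename U>
    bool operator==(const CountingAlloc<T>&, const CountingAlloc<U>&) { return true; }
    template <typename T, typename U>
    bool operator!=(const CountingAlloc<T>&, const CountingAlloc<U>&) { return false; }

    int main() {
        std::map<int, int, std::less<int>,
                 CountingAlloc<std::pair<const int, int>>> m;
        for (int i = 0; i < 1000; ++i) m.emplace(i, i);

        g_deallocations = 0;
        m.clear();
        std::printf("deallocate calls during clear(): %zu\n", g_deallocations);
        // prints 1000: one deallocation per node, i.e. linear work
    }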

Upvotes: 7

Tony Delroy

Reputation: 106086

Because it's a template, it may be known at compile time that destruction is a no-op for the element type (e.g. std::map<int, int>), so the need to destroy elements isn't a good basis for deducing a necessary worst-case performance. Still, the implementation must visit every node of the binary tree to release its heap memory, and the number of nodes relates linearly to the number of elements (erase() only invalidates iterators/references/pointers to the erased element, insert() doesn't invalidate any, etc.; all of which evidences the 1:1 relationship).

So, it's linear, but only because of the need to clean up the heap usage, even when element destructors aren't needed.

(Interestingly, this implies that a std::map<>-like associative container - or perhaps std::map<> itself with a clever custom allocator - could be made O(1) for elements with trivial no-op destructors if all the memory was allocated from a dedicated memory pool that could be "thrown away" in O(1).)
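
For illustration, here is a sketch of that pool idea using the standard polymorphic allocators (C++17 <memory_resource>). Note that std::map::clear() itself still performs the linear traversal; the pool only makes each per-node deallocation a no-op, so a true O(1) "clear" would need a container designed to abandon its nodes wholesale:

    #include <map>
    #include <memory_resource>

    int main() {
        std::pmr::monotonic_buffer_resource pool;  // deallocate() is a no-op
        std::pmr::map<int, int> m{&pool};          // nodes come from the pool

        for (int i = 0; i < 1000; ++i) m.emplace(i, i);

        m.clear();       // still traverses every node...
        pool.release();  // ...but the memory is handed back in one shot
    }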

Upvotes: 2

coanor

Reputation: 3944

As far as I know, the clear operation's complexity is O(n), because you need to destruct these objects one by one.
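
A tiny demonstration of that (Tracked is just an illustrative type with a counting destructor):

    #include <cstdio>
    #include <map>

    static int destructor_calls = 0;

    struct Tracked {
        ~Tracked() { ++destructor_calls; }
    };

    int main() {
        std::map<int, Tracked> m;
        for (int i = 0; i < 5; ++i) m[i];  // default-constructs five elements

        destructor_calls = 0;
        m.clear();
        std::printf("%d destructor calls\n", destructor_calls);  // prints 5
    }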

Upvotes: 1

Chris

Reputation: 8170

The cplusplus.com reference states that it has linear complexity in the container's size, as the destructor of each element must be called.

Upvotes: 2
