Reputation: 119
I got a bit confused about the analysis of basic data structure operations like inserting or deleting.
When I am asked to create a data structure that supports a delete or insert operation in O(1) (or any big O):
For deleting, do I need to take into account the search for the element, or only the delete operation itself, for example changing the pointers?
For inserting, do I need to take into account the search for the index/position where I need to place the element?
I looked around on the Internet, but I found different time complexities for the basic data structures I learned; for example, for an array I saw both O(1) and O(N).
Thanks
Upvotes: 0
Views: 127
Reputation: 8846
The default answer to both questions is "yes, you do need to take the extra work into account".
To understand these questions, let's see why we ask them. Why do we ask about the complexity of operations in a data structure? Most of the time, because we want to use the data structure as a black box. The typical usage pattern is: start from some initial state, perform a sequence of operations, and end in some final state.
We want to estimate the time it typically takes to arrive from the former state to the latter. So, any and all operations in between should be accounted for.
That said, sometimes the question "what time does it take to do X not accounting for some preparation step Y" also makes sense.
For example, we can find the convex hull of n points on a plane in O(n), provided that the points are already sorted in some convenient way. The same task takes O(n log n) for unsorted input.
Indeed there are cases where the input is already sorted, so both answers make sense in different contexts.
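A minimal sketch of one such O(n) method (the monotone-chain idea, shown here only for the lower hull; the class and method names are my own for illustration, assuming the input is already sorted by x, then y):

    import java.util.ArrayList;
    import java.util.List;

    public class HullDemo {
        // Cross product of (a - o) and (b - o); its sign gives the turn direction.
        static long cross(long[] o, long[] a, long[] b) {
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0]);
        }

        // Lower hull of points ALREADY sorted by x (then y): O(n) total,
        // because each point is pushed and popped at most once.
        static List<long[]> lowerHull(long[][] sortedPts) {
            List<long[]> hull = new ArrayList<>();
            for (long[] p : sortedPts) {
                while (hull.size() >= 2 &&
                       cross(hull.get(hull.size() - 2), hull.get(hull.size() - 1), p) <= 0) {
                    hull.remove(hull.size() - 1); // drop points that make a non-left turn
                }
                hull.add(p);
            }
            return hull;
        }

        public static void main(String[] args) {
            long[][] pts = {{0, 0}, {1, 2}, {2, 1}, {3, 3}}; // already sorted by x
            for (long[] p : lowerHull(pts)) {
                System.out.println(p[0] + "," + p[1]); // 0,0  2,1  3,3
            }
        }
    }

Sorting the points first would itself cost O(n log n), which is exactly the preparation step being excluded.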
For another example, in a balanced binary search tree, we can find a specific element in O(log n). However, iterating over all n elements takes only O(n) time, instead of O(n log n). So, we say that advancing to the next element during such an iteration takes O(1) amortized time.
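A minimal sketch of that iteration (assuming a bare-bones Node class; recursion stands in for a step-by-step iterator here):

    class Node {
        int key;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    class InOrderDemo {
        // Visiting all n nodes costs O(n) total: each edge of the tree is
        // traversed at most twice, so the cost per element is O(1) amortized,
        // even though a single step can occasionally cost O(log n).
        static void inOrder(Node node, java.util.List<Integer> out) {
            if (node == null) return;
            inOrder(node.left, out);   // descend into the left subtree
            out.add(node.key);         // visit the current node: O(1)
            inOrder(node.right, out);  // descend into the right subtree
        }
    }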
Upvotes: 0
Reputation: 11240
Normally, the question "how much does it cost to insert?" or "how much does it cost to delete?" is asking about the operations
structure.insert(value)
structure.delete(value)
so the cost of finding where value goes is part of the cost.
But some languages, such as Java, have a list iterator, where you can iterate over a list and say "add an element at the current position" or "delete the current element". In that case, you already know where you are.
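Concretely, with Java's standard ListIterator over a LinkedList (the values below are arbitrary), add and remove at the current position cost O(1), because no search is needed:

    import java.util.LinkedList;
    import java.util.List;
    import java.util.ListIterator;

    public class IteratorEditDemo {
        public static void main(String[] args) {
            List<Integer> list = new LinkedList<>(List.of(1, 2, 3, 4));
            ListIterator<Integer> it = list.listIterator();
            while (it.hasNext()) {
                int value = it.next();   // advance the cursor: O(1) per step
                if (value == 2) {
                    it.remove();         // delete the current element: O(1), no search
                } else if (value == 3) {
                    it.add(99);          // insert at the current position: O(1) on a LinkedList
                }
            }
            System.out.println(list);    // prints [1, 3, 99, 4]
        }
    }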
Upvotes: 0
Reputation: 167
To answer both of your questions: what is included in the time complexity of an operation depends on how much extra work is needed for that specific operation.
For instance, given an array and an element, delete takes O(n) time and includes searching for the element in the array. However, if that same array is used as a stack, the delete operation becomes O(1), since we already know the element's position: it is the top of the stack.
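A rough sketch of that contrast (the method names here are made up for illustration):

    import java.util.ArrayDeque;
    import java.util.Arrays;

    public class DeleteCostDemo {
        // Deleting a given value from an unsorted array is O(n):
        // we must search for it, then shift the remaining elements.
        static int[] deleteValue(int[] a, int value) {
            int i = 0;
            while (i < a.length && a[i] != value) i++;            // O(n) search
            if (i == a.length) return a;                          // not found
            int[] b = new int[a.length - 1];
            System.arraycopy(a, 0, b, 0, i);
            System.arraycopy(a, i + 1, b, i, a.length - i - 1);   // O(n) shift
            return b;
        }

        public static void main(String[] args) {
            System.out.println(Arrays.toString(deleteValue(new int[]{5, 3, 7}, 3))); // [5, 7]

            // With stack discipline, the element to delete is always the top: O(1).
            ArrayDeque<Integer> stack = new ArrayDeque<>();
            stack.push(5);
            stack.push(3);
            System.out.println(stack.pop()); // 3, no search needed
        }
    }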
An example with the insert operation is inserting an element at the end of a linked list. When implemented without a tail pointer, this takes O(n) time; with a tail pointer, it is O(1).
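A minimal sketch of that difference (class and method names are illustrative):

    class IntList {
        static class Node {
            int value;
            Node next;
            Node(int value) { this.value = value; }
        }

        private Node head, tail;

        // With a tail pointer: O(1), no traversal needed.
        void append(int value) {
            Node node = new Node(value);
            if (tail == null) {
                head = tail = node;   // first element
            } else {
                tail.next = node;
                tail = node;
            }
        }

        // Without a tail pointer: O(n), we must walk to the end first.
        void appendWithoutTail(int value) {
            Node node = new Node(value);
            if (head == null) { head = node; return; }
            Node cur = head;
            while (cur.next != null) cur = cur.next;
            cur.next = node;
        }
    }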
Hence, analysing time complexity is less about operations having rigid, fixed costs and more about the implementation details of the data structure.
Hope this clarifies things. Feel free to discuss any particular example you found confusing in the comments :)
Upvotes: 2