Reputation: 41
I'm learning about Big-O Notation and algorithms to improve my interview skills, but I don't quite understand how to get the time complexity.
Suppose I want to sum all the elements of the following list.
std::vector<int> myList = {1, 2, 3, 4, 5};
Case 1:
int sum = 0;
for (int it: myList)
{
sum += it;
}
Case 2:
int sum = std::accumulate(std::begin(myList), std::end(myList), 0);
Case 1 is O(N), and case 2 is apparently O(1), but I'm sure those functions do some kind of iteration internally, so the question is whether Big-O notation is calculated only from the code written in that block, or also from the functions it calls.
Upvotes: 0
Views: 263
Reputation:
For an evaluation of the time complexity, it makes more sense to count the operations that take constant time. Hence a whole sum is not a particularly good candidate for such a unit operation, unless
- the sums are always done on the same number of elements, or
- the distribution of the sum lengths is known and independent of the circumstances in which the calls are made (to avoid any bias).
Such evaluations are rather unusual.
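As a minimal sketch of counting constant-time operations (count_additions is a hypothetical helper invented here for illustration), the additions performed while summing a vector can be tallied, and the tally grows linearly with the number of elements:
#include <cstddef>
#include <iostream>
#include <vector>
// Hypothetical helper: sums a vector while counting the constant-time
// additions it performs.
std::size_t count_additions(std::vector<int> const& v)
{
    int sum = 0;
    std::size_t additions = 0;
    for (int x : v)
    {
        sum += x;     // one constant-time operation per element
        ++additions;
    }
    return additions; // equals v.size(), i.e. grows linearly with N
}
int main()
{
    std::cout << count_additions(std::vector<int>(5, 1)) << '\n';    // prints 5
    std::cout << count_additions(std::vector<int>(1000, 1)) << '\n'; // prints 1000
}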
Upvotes: 1
Reputation: 27630
case 2 is apparently O(1)
Says who? cplusplus.com says about accumulate:
Complexity
Linear in the distance between first and last.
Which is the same O(N) as your case 1 code.
(I also checked cppreference.com, but in this case it doesn't say anything about the complexity.)
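For intuition about where that linear cost comes from, here is a minimal sketch of an accumulate-like function written as the obvious loop; my_accumulate is only an illustration, not the library's actual source:
#include <vector>
// A minimal sketch, assuming an accumulate-like function is the obvious
// single pass over [first, last); the real std::accumulate is specified to
// behave equivalently, hence the linear complexity.
template <class InputIt, class T>
T my_accumulate(InputIt first, InputIt last, T init)
{
    for (; first != last; ++first)
    {
        init = init + *first;  // one step per element in the range
    }
    return init;
}
int main()
{
    std::vector<int> myList = {1, 2, 3, 4, 5};
    // Same result as the question's case 2, but the loop above makes the
    // linear number of steps explicit.
    return my_accumulate(myList.begin(), myList.end(), 0) == 15 ? 0 : 1;
}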
Upvotes: 0
Reputation: 106236
If you talk about big-O, you have to talk in respect of some unit of data being processed. Both your case 1 and case 2 are O(N) where N is the number of items in the container: the unit is an int.
You tend to want the unit - and N to be the count of - the thing that's likely to grow/vary most in your program. For example, if you're talking about processing names in phonebooks, then the number of names should be N; even though the length of individual names is also somewhat variable, there's no expected pattern of increasing average name length as your program handles larger phonebooks.
Similarly, if your program had to handle an arbitrary number of containers that tended to be roughly the same length, then your unit might be a container, and then you could think of your code - case 1 and case 2 - as being O(1) with respect to the number of containers, because whether there are 0, 1, 10 or a million other containers lying around somewhere in your program, you're only processing the one - myList. But any individual accumulate call is O(N) with respect to any individual container's ints.
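A small sketch of that distinction, assuming a program that happens to hold many other containers (otherContainers is made up for illustration): the accumulate call from the question only walks myList, so it is O(1) with respect to the container count and O(N) with respect to that container's ints.
#include <iterator>
#include <numeric>
#include <vector>
int main()
{
    // Many containers may exist in the program...
    std::vector<std::vector<int>> otherContainers(1000, std::vector<int>{1, 2, 3, 4, 5});
    std::vector<int> myList = {1, 2, 3, 4, 5};
    // ...but this call only walks myList: O(1) with respect to the number of
    // other containers, O(N) with respect to myList's ints.
    int sum = std::accumulate(std::begin(myList), std::end(myList), 0);
    return sum == 15 ? 0 : 1;  // exits 0 when the sum is as expected
}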
Upvotes: 5
Reputation: 2077
I think this example should give you an idea.
#include <vector>
int sum(std::vector<int> const& list)
{
    int result = 0;
    for (int const& elem : list)
    {
        result += elem;
    }
    return result;
}
int main()
{
    std::vector<int> test = {1, 2, 3, 4, 5, 6};
    // O(n): the loop visits every element once.
    int sum1 = 0;
    for (int const& elem : test)
    {
        sum1 += elem;
    }
    // O(???): a single call, but the iteration still happens inside sum().
    int sum2 = sum(test);
}
Upvotes: 1