Reputation: 317
I have a function that returns a vector<__int16>
with 44000 elements:
vector<__int16> get_vector_by_param(char param);
Now I want to call that function about 10 times and concatenate the returned vectors into one big vector.
Something like this:
vector<__int16> whole_vector;
for (int i = 0; i < 10; ++i) {
    vector<__int16> ret = get_vector_by_param(i);
    whole_vector.append(ret);
}
So the size of the new vector will be nontrivial: 440,000 two-byte elements, almost 0.9 MB.
What is the best way to do that in C++?
Maybe iterators? Would that help?
Upvotes: 0
Views: 137
Reputation: 504293
Design your function like the standard library does. Rather than always returning a vector, accept an output iterator and write the elements through it:
#include <cstdint>  // for std::int16_t

template <typename OutputIterator>
OutputIterator get_vector_by_param(const char param, OutputIterator out) {
    // Whatever you do to generate data:
    for (unsigned i = 0; i != 44000; ++i) {
        *out++ = static_cast<std::int16_t>(param);
    }
    return out;
}
Now you can use one vector to handle all of the data; you can reserve the memory up-front if you know exactly how many elements there will be:
std::vector<std::int16_t> whole_vector;
whole_vector.reserve(44000 * 10);
for (char i = 0; i < 10; ++i) {
    get_vector_by_param(i, std::back_inserter(whole_vector));
}
You can still provide a convenience function:
std::vector<std::int16_t> get_vector_by_param(const char param) {
    std::vector<std::int16_t> ret;
    ret.reserve(44000);
    get_vector_by_param(param, std::back_inserter(ret));
    return ret;
}
Finally, unless you know you have some constraints to meet up front, you should just design things in whatever way makes it easiest for you to get work done. Let data (i.e., your profiler) tell you where things are slow and what can be improved.
Trying to get everything perfect on the first try is far harder (and often produces wrong, buggy code) than writing it cleanly first and then optimizing.
Upvotes: 4