WewillSee

Reputation: 299

C++ pointers vs std::vector: any implication for long size variables?

I have a C background and I am recoding some old code in C++... In the process, I am starting to use C++ Vectors, which are so easy to use!

Would vectors deal well with very long streams of data? For example, in audio apps, loading a 3-minute stereo song would need almost 16M floats:

float *stereoSong = NULL;
stereoSong = new float[15787800];

Not having to deal with memory management is very nice, but I was wondering whether that huge amount of data would be handled well by C++ vectors.
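For reference, the vector version of that allocation would presumably look something like this (same element count as above, just a sketch):

#include <vector>

int main() {
    // 15,787,800 floats, zero-initialised, freed automatically when it goes out of scope.
    std::vector<float> stereoSong(15787800);
    return 0;
}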

Thanks!

Upvotes: 2

Views: 225

Answers (4)

463035818_is_not_an_ai

Reputation: 122830

The limit imposed by the implementation is std::vector::max_size(). For example, on a typical 64-bit implementation std::vector<float>::max_size() is 2305843009213693951.

However, that's just the theoretical limit imposed by the implementation. Long before you reach it you will hit the memory limit of your hardware.

A std::vector<float> does not use (substantially) more memory than a dynamic C array.
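A quick way to check both numbers on your own toolchain is something like this (the 16M figure is taken from the question):

#include <iostream>
#include <vector>

int main() {
    std::vector<float> v;
    // Theoretical upper bound imposed by the implementation.
    std::cout << "max_size: " << v.max_size() << '\n';
    // Rough payload of the buffer from the question, ignoring the
    // few pointers of bookkeeping inside the vector object itself.
    std::cout << "16M floats ~ "
              << (16000000 * sizeof(float)) / (1024 * 1024) << " MiB\n";
    return 0;
}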

Upvotes: 0

Persixty

Reputation: 8589

The answer very much depends on your platform. You're talking about roughly 120 MiB of data if stored as doubles (a double is 8 bytes on practically all modern platforms), or half that as floats. This code has no trouble on a 'toy' environment (https://ideone.com/A6GmvQ):

#include <cstddef>
#include <iostream>
#include <vector>

// Append 'size' values and report every time the vector re-allocates
// its internal buffer (i.e. whenever data() changes).
void build(std::vector<double>& data, const std::size_t size){
    void* ptr{data.data()};
    for(std::size_t i{1};i<=size;++i){
        data.push_back(2*i);
        if(data.data()!=ptr){
            ptr=data.data();
            std::cout << i << ' ' << ptr << ' ' << data.capacity() << std::endl;
        }
    }
}

int main() {
    std::size_t size{100000000L};
    
    std::cout << ((size*sizeof(double))/1024/1024) << " MiB" << std::endl;
    
    std::vector<double> data{};
    
    build(data,size);
    
    // Touch every element so the work isn't optimised away.
    double sum{0};
    for(auto curr : data){
        sum+=curr;
    }
    std::cout << sum << std::endl;
    return 0;
}

This code is deliberately naive and doesn't even try to reserve() capacity for the values up front (which can help), because std::vector<> manages its own growth anyway.

Behind the scenes the vector allocates a block of storage and re-allocates a larger one whenever the logical size of the vector would exceed the capacity. The code 'watches' the internal representation and outputs a line at each re-allocation.

There are member functions (reserve() and capacity(), for example) to help with that capacity management if you're consuming the values as a stream (which sounds likely for audio).
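For instance, a single reserve() call up front removes all of the intermediate re-allocations; a sketch using the size from the question:

#include <cstddef>
#include <vector>

int main() {
    constexpr std::size_t samples = 15787800;  // size from the question

    std::vector<float> stereoSong;
    stereoSong.reserve(samples);       // one allocation, capacity fixed up front

    for (std::size_t i = 0; i < samples; ++i) {
        stereoSong.push_back(0.0f);    // never triggers a re-allocation now
    }
    return 0;
}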

The short answer is 'give it a go'. I've cranked that up to 100M doubles and had no issue.

Upvotes: 2

Surt

Reputation: 16109

std::vector and co. were one of the reasons I changed from C to C++.

It takes all the boilerplate out of array management.

When I needed to resize an array allocation in C, I had to do the following (see the sketch after the list):

  1. allocate new memory
  2. copy the elements over
  3. delete the old memory
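Roughly, the difference looks like this; the hand-rolled grow() below is just an illustrative sketch of those three steps, while the vector version is a single call:

#include <algorithm>
#include <cstddef>
#include <vector>

// The manual C-style dance: allocate, copy, delete.
float* grow(float* old, std::size_t oldSize, std::size_t newSize) {
    float* bigger = new float[newSize];       // 1. allocate new memory
    std::copy(old, old + oldSize, bigger);    // 2. copy the elements over
    delete[] old;                             // 3. delete the old memory
    return bigger;
}

int main() {
    // With std::vector the same growth is one line:
    std::vector<float> samples(1000);
    samples.resize(2000);  // allocation, copy and cleanup handled internally
    return 0;
}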

Also, all lifetime management is handled by the std::vector: no more messing around with delete at the end of the lifetime, which makes handling multiple exit points in a function much easier.
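As a small sketch of what that buys you (hypothetical function, just to show the early-return case):

#include <cstddef>
#include <vector>

// With a raw new[] buffer, every return path below would need its own delete[].
// With std::vector, the memory is released automatically whichever way we leave.
bool process(std::size_t n) {
    std::vector<float> buffer(n);
    if (buffer.empty()) {
        return false;   // exit point 1: no manual cleanup needed
    }
    // ... fill and use the buffer ...
    return true;        // exit point 2: the vector's destructor frees the memory
}

int main() {
    return process(1024) ? 0 : 1;
}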

Upvotes: 1

Asteroids With Wings

Reputation: 17454

This is a false comparison.

For a start, vectors use pointers. They have to. Vectors are containers that use dynamic allocation to provide you with a buffer of data items. You could try to implement the same thing "with pointers", but you'd end up with something somewhere between a vector and a worse version of a vector.

So, vectors can handle as much data as you'd be able to handle with new double[] — that is, a lot.
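You can even get at the pointer a vector manages: data() hands it back, so anything you could do with a new float[] buffer you can also do with the vector's buffer. A small sketch:

#include <iostream>
#include <vector>

int main() {
    std::vector<float> stereoSong(15787800);   // size from the question

    // The vector's storage is an ordinary heap buffer behind a pointer,
    // just like the one new float[15787800] would give you.
    float* raw = stereoSong.data();
    raw[0] = 0.5f;

    std::cout << stereoSong[0] << '\n';        // prints 0.5
    return 0;
}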

Upvotes: 6
