Reputation: 11813
I have an application that does some Monte Carlo simulation. For each run, a 12 MB file is loaded into a std::vector<MyData>. The object that loads and stores the data is held through a boost::shared_ptr, which goes out of scope when the run finishes.
I see the memory usage of the application grow in Windows Task Manager to about 1 GB (after 80-90 runs), but it then usually drops back to 50 MB (and starts growing again over the following runs). So I wonder whether this is a memory leak or just normal behavior. Should/could I do anything to explicitly free the memory held by the vector, or something else?
Thanks for any hints,
Philipp
Upvotes: 0
Views: 478
Reputation: 11813
Thanks everybody for your hints. It turned out that it actually WAS a memory leak, caused by a missing virtual destructor in my AbstractSensorDataSource class, which was storing the loaded data.
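For anyone hitting the same symptom, here is a minimal sketch of the pattern (FileSensorDataSource and loadRun are invented for illustration; only AbstractSensorDataSource and MyData are from my code): when the object is created behind a pointer to the base class and the base destructor is not virtual, the derived destructor, and with it the vector's destructor, never runs.

```cpp
#include <boost/shared_ptr.hpp>
#include <vector>

struct MyData { double values[64]; };  // stand-in for the real record type

struct AbstractSensorDataSource {
    // The fix: without "virtual" here, deleting a derived object through a
    // base-class pointer skips the derived destructor entirely.
    virtual ~AbstractSensorDataSource() {}
};

// Hypothetical concrete loader owning the ~12 MB of run data.
struct FileSensorDataSource : AbstractSensorDataSource {
    std::vector<MyData> data;
};

// Hypothetical factory returning the base type; this is where the derived
// type gets erased before the shared_ptr takes ownership of the pointer.
AbstractSensorDataSource* loadRun() {
    FileSensorDataSource* s = new FileSensorDataSource();
    s->data.resize(24000);  // simulate loading the run file
    return s;
}

void runOnce() {
    boost::shared_ptr<AbstractSensorDataSource> source(loadRun());
    // ... Monte Carlo pass using source ...
}   // the shared_ptr deletes through AbstractSensorDataSource*; with a
    // non-virtual base destructor, ~FileSensorDataSource and the vector's
    // destructor never run, leaking the loaded data on every run
```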
Upvotes: 1
Reputation: 19181
That actually sounds about right.
90 * 12 = 1080 MB = 1.0546875 GB
You should consider using another allocator or reducing the number of rounds.
If you would like to release the memory explicitly, you should either use a regular pointer or call the reset() function of shared_ptr.
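A minimal sketch of that, assuming the vector is held through the shared_ptr as described in the question (names are illustrative); the second function shows the separate swap idiom for returning a vector's capacity to the allocator, which is not strictly what reset() does but is often what people are after:

```cpp
#include <boost/shared_ptr.hpp>
#include <vector>

struct MyData { double values[64]; };  // stand-in for the real record type

void finishRun(boost::shared_ptr<std::vector<MyData> >& data) {
    // Release explicitly: drop this reference now instead of waiting for
    // scope exit; the vector is destroyed as soon as the last owning
    // shared_ptr is gone.
    data.reset();
}

void freeVectorCapacity(std::vector<MyData>& v) {
    // clear() alone keeps the allocated capacity; swapping with an empty
    // temporary hands the old buffer to the temporary, which frees it.
    std::vector<MyData>().swap(v);
}
```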
Use a profiler, as others have mentioned, to see whether another allocator actually has a positive impact on memory usage.
Upvotes: 0