Reputation: 7
I am running a piece of code in R. It's parallelized, running on 8 cores. Interestingly, when my memory usage reaches 15-something GB, it drops to 10 GB (my maximum memory is 16 GB). I am curious about what is actually happening in the background. In the end, I get the complete data from all 8 cores, so I assume that data doesn't get lost. Does the PC store it somewhere on the SSD to free memory?
For more information: I loop over time series data and perform a lot of calculations, which I store in multiple vectors. When the code finishes looping, it stores all of the previous vectors in a list. If I start opening many Chrome tabs while the code is running, which requires a lot of memory, the run may take longer but it still retrieves all the data (though it sometimes crashes).
I'm very curious what is happening here.
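To give an idea of the structure, the pattern is roughly like the sketch below (simplified, with made-up calculations rather than my actual ones): each worker loops over its part of the time series, fills a few vectors, and returns them bundled in a list.

    library(parallel)

    ts_data <- rnorm(1e6)                                # stand-in for the time series
    chunks  <- split(ts_data, cut(seq_along(ts_data), 8))

    cl <- makeCluster(8)                                 # one worker per core
    results <- parLapply(cl, chunks, function(chunk) {
      means <- numeric(length(chunk))
      sds   <- numeric(length(chunk))
      for (i in seq_along(chunk)) {
        window   <- chunk[max(1, i - 10):i]              # small rolling window
        means[i] <- mean(window)
        sds[i]   <- sd(window)
      }
      list(means = means, sds = sds)                     # bundle the vectors in a list
    })
    stopCluster(cl)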
Upvotes: 0
Views: 39
Reputation: 1279
It's impossible to say without the specific code, but most likely it's due to R's garbage collection, which runs only when necessary, typically when more memory needs to be allocated. Unlike some other languages such as Python, R does not immediately garbage-collect objects when they go out of scope, and in particular, if the R objects hold an underlying pointer to a C/C++ object, garbage collection can be deferred until long after the object has become unreachable.
If this variable memory usage is a problem, you can try adding explicit calls to gc() at key points in your code.
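For example, something along these lines (a minimal sketch with made-up work, not tailored to your code): drop the large intermediate object and call gc() at the end of each iteration so freed memory is returned promptly.

    n_iter  <- 50
    results <- vector("list", n_iter)

    for (i in seq_len(n_iter)) {
      big_tmp <- rnorm(1e6)             # stand-in for a large intermediate object
      results[[i]] <- mean(big_tmp)     # keep only the small result you need
      rm(big_tmp)                       # drop the large intermediate
      gc()                              # explicitly trigger garbage collection
    }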
Upvotes: 1
Reputation: 2096
Yes, you are right: the PC sometimes uses the hard disk as memory. This is known as swap memory. When your RAM gets full, the operating system moves some of the data to the disk and stores it there temporarily.
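If you want to check whether swap is actually being used, on Linux you can inspect it from within R (a quick sketch assuming the free command is available); gc() also reports how much memory R itself has allocated.

    system("free -h")   # system-wide memory and swap usage (Linux)
    gc()                # R's own memory usage as reported by the garbage collector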
Upvotes: 0