Reputation: 2687
I am running R code (under Windows) that involves a lot of in-memory data. I tried rm(list=ls())
to free memory, but the memory still seems occupied and I cannot rerun my code. I tried closing R and restarting it, but the problem persists: when I rerun the code it says it cannot allocate memory (even though it could the first time). The memory only seems to be released after I restart my PC.
Is there any way to free the memory so that I can rerun my code without restarting my PC every time?
Upvotes: 85
Views: 206568
Reputation: 1
Launch a terminal and check the location of previously saved R objects:
ls -a
Then remove the previously saved data:
rm -rf .RData
rm -rf .RDataTmp
Launch RStudio; the previously saved objects should no longer appear under the Environment tab.
You could also try running gc()
inside R.
Upvotes: 0
Reputation: 21
There is only so much you can do with rm() and gc(). As Gavin Simpson suggested, even if you free the actual memory in R, Windows often won't reclaim it until you close R or until other demand fills up the apparent Windows memory.
This usually isn't a problem. However, if you are running large loops it can lead to fragmented memory in the long term, such that even after you free the memory and restart R, the fragmentation may prevent you from allocating large contiguous chunks — especially if other applications were allocated memory interleaved with R's while it was running. rm() and gc() may delay the inevitable, but more RAM is better.
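One common way to reduce this kind of fragmentation in long loops (a general pattern, not something from the answer above) is to allocate the large object once and overwrite it in place, rather than recreating it on every iteration. A minimal sketch, with made-up sizes:

```r
# Sketch: reuse one preallocated buffer instead of allocating a fresh
# large matrix on every iteration, which can fragment memory over a
# long-running loop.
n_iter <- 5
buf <- matrix(0, nrow = 1000, ncol = 100)  # allocated once, up front

results <- numeric(n_iter)
for (i in seq_len(n_iter)) {
  buf[] <- runif(length(buf))  # overwrite in place, no new allocation
  results[i] <- mean(buf)
}
```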
Upvotes: 2
Reputation: 393
Just adding this for reference in case anybody needs to restart and immediately run a command.
I'm using this approach just to clear the system's RAM. Make sure you have deleted all objects that are no longer required. Maybe gc()
can also help beforehand. But nothing clears RAM better than restarting the R session.
library(rstudioapi)
restartSession(command = "print('x')")
Upvotes: 4
Reputation: 399
I've found it helpful to go into my "tmp" folder and delete all hanging rsession files. This usually frees any memory that seems to be "stuck".
Upvotes: 0
Reputation: 2642
Use the ls()
function to see which R objects are occupying space, and use rm("objectName") to clear objects that are no longer required from R's memory.
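The ls()/rm() workflow described above might look like this (object names are made up for illustration):

```r
# Sketch: list the objects in the workspace, then remove a large one
# by name and confirm it is gone.
big   <- runif(1e6)   # a large object we no longer need
small <- 1:10         # a small object we want to keep

names_before <- ls()  # includes "big" and "small"
rm("big")             # remove the large object by name
names_after <- ls()   # "big" no longer listed
```

A follow-up gc() after rm() may prompt R to return the freed memory to the operating system.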
Upvotes: 13
Reputation: 654
I ran into the same problem with R. I dug into it a bit and found a solution: you need to restart the R session to fully release the memory/RAM. To do this, run a short snippet after removing everything from your workspace (note that .rs.restartR() only works inside RStudio). The code is as follows:
rm(list = ls())
.rs.restartR()
Upvotes: 41
Reputation: 456
memory.size(max=T) # gives the amount of memory obtained by the OS
[1] 1800
memory.size(max=F) # gives the amount of memory being used
[1] 261.17
Using Paul's example,
m = matrix(runif(10e7), 10000, 1000)
Now
memory.size(max=F)
[1] 1024.18
To clear up the memory
gc()
memory.size(max=F)
[1] 184.86
In other words, the memory should now be clear again. If you run code in a loop, it is a good idea to add gc()
as the last line of the loop, so that memory is cleared up before the next iteration starts.
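The loop pattern described above might look like this (the per-iteration work and sizes are made up for illustration):

```r
# Sketch: free a large temporary object and trigger garbage collection
# at the end of each iteration, so memory is released before the next
# iteration allocates again.
totals <- numeric(3)
for (i in 1:3) {
  tmp <- runif(1e6)   # large temporary object (illustrative size)
  totals[i] <- sum(tmp)
  rm(tmp)             # drop the reference ...
  gc()                # ... and prompt R to reclaim the memory
}
```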
Upvotes: 8
Reputation: 3203
Maybe you can try to use the function gc()
. A call to gc()
causes a garbage collection to take place. It can be useful to call gc()
after a large object has been removed, as this may prompt R to return memory to the operating system.
gc()
also returns a summary of memory usage.
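The summary mentioned above is gc()'s return value: a small matrix with one row for cons cells ("Ncells") and one for vector memory ("Vcells"), showing amounts used and the current collection trigger. A quick sketch:

```r
# Sketch: remove a large object, then run gc() and inspect the
# summary it returns.
x <- runif(1e7)   # roughly 80 MB of doubles
rm(x)
info <- gc()      # triggers a collection and returns a usage summary
rownames(info)    # "Ncells" "Vcells"
```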
Upvotes: 103
Reputation: 60924
An example under Linux (Fedora 16) shows that memory is freed when R is closed:
$ free -m
total used free shared buffers cached
Mem: 3829 2854 974 0 344 1440
-/+ buffers/cache: 1069 2759
Swap: 4095 85 4010
2854 megabytes is used. Next I open an R session and create a large matrix of random numbers:
m = matrix(runif(10e7), 10000, 1000)
when the matrix is created, 3714 MB is used:
$ free -m
total used free shared buffers cached
Mem: 3829 3714 115 0 344 1442
-/+ buffers/cache: 1927 1902
Swap: 4095 85 4010
After closing the R session, I nicely get back the memory I used (2856 MB free):
$ free -m
total used free shared buffers cached
Mem: 3829 2856 972 0 344 1442
-/+ buffers/cache: 1069 2759
Swap: 4095 85 4010
Of course you use Windows, but you could repeat this exercise on Windows and report how the available memory develops before and after you create this large dataset in R.
Upvotes: 3