Reputation: 583
I have a very large list of data frames (300 data frames, each with 2 columns and 300–600 rows), and I want to join all of them with
final <- subset %>% reduce(full_join, by = "Frame_times")
When I try to do this, however, I get the following error:
Error: cannot allocate vector of size 265.6 Mb
I am operating on 64-bit Windows 10 with the latest installation of 64-bit R (4.0.0). I have 8 GB of RAM, and
> memory.limit()
[1] 7974
> memory.size(max = TRUE)
[1] 7939.94
I have also tried calling gc(), but it did not help.
It appears that I have enough space and memory to run this, so why am I getting this error? And how can I fix it?
Thank you very much!
Upvotes: 0
Views: 428
Reputation: 671
You are running out of RAM. A first troubleshooting step would be to run the code on a smaller subset of data frames (say, 3). Are the results, in particular the number of rows, what you were expecting? If so, and it really is doing the right thing, it might help to do the joins in batches (say, 3 batches of 100). The most likely scenario is that the number of rows or columns is blowing up to something much bigger than you expect.
The 265.6 MB mentioned in the error is just the allocation that finally failed, not the total memory you're using.
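A minimal sketch of both ideas (the small-sample sanity check and the batched join), using a toy list of 10 small data frames as a stand-in for your real `subset` list; the column names `v1`, `v2`, … are made up for the example:

```r
library(dplyr)
library(purrr)

# Toy stand-in for the real list of 300 data frames: each has a shared
# "Frame_times" key column plus one value column.
dfs <- lapply(1:10, function(i) {
  d <- data.frame(1:50, rnorm(50))
  names(d) <- c("Frame_times", paste0("v", i))
  d
})

# 1. Sanity check on a small sample: if the join key really is shared,
#    the row count should stay close to the inputs', not explode.
check <- dfs[1:3] %>% reduce(full_join, by = "Frame_times")
nrow(check)

# 2. Join in batches to keep peak memory down: join each batch,
#    then join the (fewer, wider) partial results.
batches <- split(dfs, ceiling(seq_along(dfs) / 5))
partials <- lapply(batches, function(b) reduce(b, full_join, by = "Frame_times"))
final <- reduce(partials, full_join, by = "Frame_times")
```

If the sanity check already shows far more rows than expected, the problem is duplicate or mismatched `Frame_times` values multiplying rows in the join, not memory per se.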
Upvotes: 1