Mel

Reputation: 750

"Can't allocate vector of size..." error even though memory.limit() and memory.size() are much higher

I want to run a random forest on my data e.g.

# fit a random forest model (using ranger)
rf_fit <- train(as.factor(y_variable) ~ ., 
                data = training_set, 
                method = "ranger")

which returns:

Error: cannot allocate vector of size 5.8 Gb

but

memory.limit()

returns:

[1] 20000

and

memory.size()

returns a slightly smaller value (but still much bigger than 6 GB):

[1] 18785.67

So why can't it allocate the memory?

Upvotes: 0

Views: 1295

Answers (1)

MattB

Reputation: 671

That error is the "last straw" of memory allocation: it refers to the single additional block R tried and failed to allocate, not to your total usage. R needs that 5.8 Gb as one extra chunk on top of everything it already holds. Note that `memory.size()` reports how much memory R is currently *using* (in Mb), not how much is free, so your output of 18785.67 means R is already using about 18.8 GB of your 20 GB limit, leaving roughly 1.2 GB available, far less than the 5.8 GB requested. That can happen because you already have large objects in your workspace, or because fitting the random forest allocates several intermediate objects whose combined size exceeds the limit.
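A sketch of how you might diagnose and reclaim memory before the fit (assumes a Windows session, since `memory.size()`/`memory.limit()` are Windows-only; `"training_set"` and `"y_variable"` are the names from the question):

```r
# Sizes of objects already in the workspace, largest first:
sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)

# Drop everything except the training data, then force garbage collection
# and report how much memory R is holding:
rm(list = setdiff(ls(), "training_set"))
gc()

# If the fit still fails, calling ranger() directly can help, since it
# skips some of the extra copies caret's formula interface makes:
library(ranger)
rf_fit <- ranger(
  dependent.variable.name = "y_variable",  # column name assumed from the question
  data = training_set,
  classification = TRUE
)
```

If even that is not enough, reducing the data (fewer rows or predictors) or fitting on a machine with more RAM are the remaining options.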

Upvotes: 1

Related Questions