Reputation: 11
I'm working on some species distribution modeling in R (package ENMeval 2.0.4) with a large stacked raster of environmental predictors (105 GB) covering the northeastern US, on a laptop with 32 GB of RAM. I've worked through issues in most of my code; however, when R gets to the point of actually running the model, I get this error: "cannot allocate vector of size 142.1 Gb". Here is the call:
BLSS_sdm <- ENMevaluate(taxon.name = sppName,
                        occs = sf::st_coordinates(SppLocs),
                        envs = env_var,
                        categoricals = c("nlcd", "US_L4CODE", "successional"),
                        bg = bkgrd.samps,
                        algorithm = 'maxent.jar',
                        partitions = "block",
                        other.settings = list(abs.auc.diff = FALSE,
                                              pred.type = "cloglog",
                                              validation.bg = "partition"),
                        tune.args = list(fc = c("L", "Q")),
                        overlap = FALSE,
                        raster.preds = FALSE,
                        rmm = base_rmm)
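For context, here is roughly where a number like 142 GB comes from as far as I can tell (a back-of-the-envelope sketch, assuming env_var is a terra SpatRaster; swap in raster::nlayers() etc. if it's a RasterStack):

library(terra)

# If the whole stack were loaded into RAM, every cell of every layer
# would be stored as an 8-byte double.
cells  <- ncell(env_var)         # cells per layer
layers <- nlyr(env_var)          # number of predictor layers
cells * layers * 8 / 1e9         # rough size in GB if fully in memory

# terra can also report how much memory processing this stack would need
mem_info(env_var)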
I tried pointing my temp directory at my external hard drive, which should have enough free space, but I still get the same error, except the vector size has now doubled: "cannot allocate vector of size 284.1 Gb".
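For reference, this is roughly how I redirected the temporary files (a sketch; the drive letter and path are just an example of my setup):

# terra writes its temporary rasters to its own temp directory,
# which can be redirected without restarting R (example path):
terra::terraOptions(tempdir = "E:/R_temp")

# the older raster package has an equivalent option:
raster::rasterOptions(tmpdir = "E:/R_temp")

# R's own tempdir() is fixed at startup, so TMPDIR has to be set in
# .Renviron (or the shell environment) before launching R.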
I saw that ENMeval 2.0.5 allows you to skip generating raster predictions (which I produce later in my code anyway), so I updated the package and tried again with raster.preds = FALSE, as shown above.
My job is willing to get me a computer with more RAM, but is that even reasonable? Would a workstation with 64 GB of RAM even be able to handle this?
My boss and I have also talked about extracting the raster values at our points to decrease the data size, or removing some variables, but I'd prefer not to unless there's no other option.
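For what it's worth, the value extraction we've discussed would look roughly like this (a sketch only; I'm assuming terra, that bkgrd.samps is a two-column table of coordinates, and I haven't confirmed exactly what ENMevaluate needs for a rasterless run):

library(terra)

# Pull predictor values at the occurrence and background points so the
# model could be fit from tables instead of the 105 GB stack.
occ_xy  <- sf::st_coordinates(SppLocs)
occ_env <- as.data.frame(terra::extract(env_var, occ_xy))
bg_env  <- as.data.frame(terra::extract(env_var, as.matrix(bkgrd.samps)))

# Coordinates plus their extracted predictor values (SWD-style tables)
occs_swd <- cbind(as.data.frame(occ_xy), occ_env)
bg_swd   <- cbind(as.data.frame(bkgrd.samps), bg_env)

# The ENMeval vignette describes passing tables like these as occs/bg
# (with predictor columns included) instead of a raster stack, though
# I'd need to double-check the exact argument requirements.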
So, long story short: is there any way to get this to run on the hardware I have, or is more RAM (or shrinking the environmental data) really the only option?
Thanks in advance for any help or advice you might be able to give.
Upvotes: 1
Views: 40