user29798977

Reputation: 11

Is there a way around the error "cannot allocate vector of size XX Gb" for SDM in R?

I'm working on some species distribution modeling in R (package ENMeval 2.0.4) with a large stacked raster of environmental parameters (105 GB) covering the Northeast of the US, on a laptop with 32 GB of RAM. I've worked through issues in most of my code; however, when R gets to the point of actually fitting the model, I get this error: "Error: cannot allocate vector of size 142.1 Gb."
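For context on where a number like 142.1 Gb comes from: materializing every raster cell for every layer as a double-precision matrix costs 8 bytes per value, so the footprint is roughly ncell × nlayers × 8 bytes. The cell and layer counts below are made-up illustrative values, not my actual raster:

```r
# Rough memory footprint of materializing a raster stack as a numeric matrix.
# These counts are hypothetical, for illustration only.
ncell   <- 1.5e9  # cells per layer (hypothetical)
nlayers <- 12     # number of environmental layers (hypothetical)

bytes <- ncell * nlayers * 8            # 8 bytes per double-precision value
cat(sprintf("%.1f Gb\n", bytes / 1e9))  # roughly 144 Gb at these counts
```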

BLSS_sdm <- ENMevaluate(taxon.name = sppName,
              occs = sf::st_coordinates(SppLocs),
              envs = env_var,
              categoricals = c("nlcd","US_L4CODE","successional"),
              bg = bkgrd.samps,
              algorithm = 'maxent.jar',
              partitions = "block", 
              other.settings = list(abs.auc.diff = FALSE,
                                    pred.type = "cloglog",
                                    validation.bg = "partition"),
              tune.args = list(fc = c("L", "Q")),
              overlap = FALSE,
              raster.preds = FALSE,
              rmm = base_rmm)

I tried changing my temp directory to my external hard drive, which should have enough space, but I get the same error with the vector size doubled: "Error: cannot allocate vector of size 284.1 Gb."
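For reference, the temp-directory change I tried looked roughly like this (this assumes the environmental stack is a terra SpatRaster; the drive path is a placeholder):

```r
library(terra)  # assuming the env stack is handled by terra

# Point terra's temporary files at the external drive, and cap the fraction
# of available RAM terra will use before spilling to disk.
# The path below is a placeholder for my external drive.
terraOptions(tempdir = "E:/terra_tmp", memfrac = 0.5)
```

Note that this only redirects terra's on-disk temp files; the allocation in the error happens when the data are pulled into RAM as one vector, which may be why the tempdir change alone didn't help.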

I saw that ENMeval 2.0.5 allows turning off raster predictions (which I also generate later in my code), so I updated the package and tried again with raster predictions disabled.

My employer is willing to get me a computer with more RAM, but is that even reasonable? Would a workstation with 64 GB of RAM even be able to handle it?

My boss and I have also talked about maybe extracting raster values to decrease the size, or removing some variables, but I'd prefer not to unless there's no other option.
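If it comes to downsizing, the kind of thing we've discussed would look something like the sketch below (terra assumed; the file name, layer names, aggregation factor, and extent are all placeholders, not my real data):

```r
library(terra)

env_var <- rast("env_stack.tif")  # placeholder file name

# Option 1: coarsen the resolution, e.g. 2x2 blocks of cells -> 1 cell.
# Use "mean" for continuous layers; categorical layers (like nlcd)
# would need fun = "modal" instead.
env_coarse <- aggregate(env_var, fact = 2, fun = "mean")

# Option 2: drop the least informative layers (hypothetical layer names).
env_small <- subset(env_var, c("bio1", "bio12", "nlcd"))

# Option 3: crop to a tighter study extent (hypothetical coordinates).
env_crop <- crop(env_var, ext(-80, -66, 38, 48))
```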

So, long story short:

  1. Has this been a problem for those of you who have used ENMeval over large areas?
  2. Do you have any code suggestions for fixing it?
  3. If not, do you have any data-management suggestions (e.g., downsizing the area or layers) for handling it?
  4. If anyone is comfortable speaking to computing requirements: would a workstation with more RAM be able to handle this? This is just the first of many SDMs, some of which may be larger or smaller.

Thanks in advance for any help or advice you might be able to give.

Upvotes: 1

Views: 40

Answers (0)
