Nova

Reputation: 618

What is causing my "vector memory exhausted" error?

For context: I am running a large simulation that takes many hours and 30+ iterations. Every time, seemingly at random somewhere between iterations 16 and 23, I get the following error:

[screenshot of the error message: "vector memory exhausted"]

I believe I've narrowed down the issue to the following code block (which is not surprising given that this is one of the slowest steps of the algorithm):

adj_mat <- dists_mat

a <- (adj_mat <= cutoff)
b <- (adj_mat > cutoff)

adj_mat[a] <- TRUE
adj_mat[b] <- FALSE

adj_mat is a very large 2,000 x 40,000 distance matrix. In the code block, I am trying to binarize every cell of this matrix against a cutoff value. Step 1 builds two logical masks: a marks the cells at or below the cutoff (which should become 1) and b marks the cells above it (which should become 0). Step 2 then assigns TRUE/FALSE (i.e. 1s and 0s) to those cells of the matrix.

I just don't understand why this code works for 10+ iterations and then exhausts the vector memory. I'm not saving very much data within the function after each iteration.

Is there perhaps a more memory efficient/less computationally demanding way to binarize that matrix?
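A back-of-envelope calculation (my own arithmetic, not from the question) suggests why this block is a likely culprit: at 2,000 x 40,000 cells, each numeric copy of the matrix and each logical mask is large on its own, and the block keeps several of them alive at once.

```r
# Rough memory cost of the block above, assuming a 2,000 x 40,000
# numeric (double) matrix and R's usual 8-byte doubles / 4-byte logicals
n_cells <- 2000 * 40000              # 80 million cells
dbl_mb  <- n_cells * 8 / 1024^2      # one numeric copy: ~610 MiB
lgl_mb  <- n_cells * 4 / 1024^2      # one logical mask:  ~305 MiB

# dists_mat + adj_mat are two numeric copies; a + b are two logical
# masks, so the block holds roughly this much at once (before counting
# any extra copies the two subset-assignments may trigger):
peak_mb <- 2 * dbl_mb + 2 * lgl_mb   # ~1.8 GiB
```

Held across 16-23 iterations, even a modest amount of additional retained state per iteration could push this over the allocator's limit.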

Upvotes: 0

Views: 116

Answers (2)

akrun

Reputation: 886938

We can use split to create a list of output in a single line:

lst1 <- split(adj_mat, adj_mat <= cutoff)
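A toy illustration of what this returns (using a small hypothetical matrix and cutoff, since the real dists_mat is 2,000 x 40,000): split groups the matrix values by the logical condition, yielding a two-element list named "FALSE" and "TRUE".

```r
# Small hypothetical inputs for illustration
dists_mat <- matrix(c(0.2, 1.5, 0.9, 2.0), nrow = 2)
cutoff <- 1.0

# Group the values by whether they fall at or below the cutoff
lst1 <- split(dists_mat, dists_mat <= cutoff)

lst1$`TRUE`    # values at or below the cutoff: 0.2, 0.9
lst1$`FALSE`   # values above the cutoff: 1.5, 2.0
```

Note that split flattens the matrix into plain vectors grouped by the condition, so this suits workflows that only need the two groups of values, not a binarized matrix of the original shape.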

Upvotes: 1

Ronak Shah

Reputation: 388807

I am not sure if this will help, but you can reduce this to only two lines. This avoids creating the temporary logical matrices a and b and removes the extra subset-assignment steps.

adj_mat <- dists_mat
adj_mat <- adj_mat <= cutoff
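A small sketch of the result, on a hypothetical matrix: the comparison returns a logical matrix of the same shape in one vectorized step, with no intermediate masks to keep in memory (and logicals take 4 bytes per cell versus 8 for doubles).

```r
# Hypothetical small inputs; the real dists_mat is 2,000 x 40,000
dists_mat <- matrix(c(0.2, 1.5, 0.9, 2.0), nrow = 2)
cutoff <- 1.0

# One vectorized comparison: TRUE where the distance is at or
# below the cutoff, FALSE where it is above
adj_mat <- dists_mat <= cutoff
```

If numeric 0/1 cells are needed downstream rather than TRUE/FALSE, `1 * (dists_mat <= cutoff)` gives the same matrix as doubles, at the cost of the larger numeric storage.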

Upvotes: 1
