Nik D.

Reputation: 31

Optimum memory usage in R

I am using a 64-bit Windows machine; both RStudio and R are 64-bit. They run on an EC2 instance of type r5.4xlarge, which has 16 cores and about 128 GB of memory. If I run memory.limit() I see 100 GB, since I have set that limit in my .Rprofile file. Still, when I run the script with Rscript, Task Manager shows only about 10 GB of memory in use.
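For reference, I set the limit in .Rprofile with a line like this (the size argument is in MB; the exact value here is illustrative):

memory.limit(size = 102400)  # roughly 100 GB, since size is given in MB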

How can I make sure R uses the available memory so that the script runs faster? If I run the same script on my local machine with 64 GB of RAM, it finishes in 5 minutes at 100% CPU usage, but on EC2 it takes 15 minutes at only 25% CPU usage. Please let me know if additional information is required.

Upvotes: 0

Views: 153

Answers (1)

Carlos Santillan

Reputation: 1087

I'm not sure that memory is the issue here.

Since you note that the server runs at only 25% CPU usage while your local machine runs at 100%, it could be that your code is parallelized locally but not on the VM.
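As a quick check, you can compare how many cores R sees on each machine and whether your script actually uses them. Here is a minimal sketch with the base parallel package (the worker function and cluster size are placeholders, not taken from your script):

library(parallel)

detectCores()  # how many cores R can see on this machine

## On Windows, mclapply() runs serially, so use a PSOCK cluster
## to spread work across cores.
cl <- makeCluster(detectCores() - 1)
res <- parLapply(cl, 1:100, function(i) sqrt(i))
stopCluster(cl)

If detectCores() reports 16 on the EC2 instance but your script only ever runs one worker, that alone would explain the 25% CPU usage.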

Another thing to look at: are you running Microsoft R Open locally, but not on the VM? R Open uses the Intel MKL (Math Kernel Library) by default, which is a much faster implementation of the BLAS libraries.

You can check with:

sessionInfo()

For the standard R library you will see:

other attached packages:
[1] Matrix_1.2-12

and for R Open, something like:

other attached packages:
[1] RevoUtilsMath_10.0.0
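If you want to compare the two machines directly, a rough BLAS benchmark is to time a large matrix multiplication; a multithreaded BLAS such as MKL should finish it several times faster than the single-threaded reference BLAS (the matrix size here is arbitrary):

set.seed(1)
m <- matrix(rnorm(4000 * 4000), nrow = 4000)
system.time(crossprod(m))  # t(m) %*% m, a BLAS-heavy operation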

Upvotes: 2
