MOzSalles

Reputation: 205

Why are library() calls hitting my R memory limit?

I'm working on a Laravel app that connects to an R server via Rserve (using ctuberlin/rserve). The app runs in a Docker container and the R process runs in a separate container. Link to GitHub here.

The app lets us create tasks and exercises in R that are corrected automatically. We have to limit resources for each user session so that faulty student code doesn't take down the whole Docker container.

We are using unix::rlimit_as(1e9) (roughly a 1 GB cap) as the memory limit.
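For reference, this is how the limit is applied at the start of each session (a minimal sketch; as far as I understand, unix::rlimit_as() wraps setrlimit(RLIMIT_AS, ...), so the cap is on the process's virtual address space, in bytes):

```r
library(unix)

# Cap this R process's address space at 1e9 bytes (~1 GB).
# rlimit_as() wraps setrlimit(RLIMIT_AS, ...), so this limits virtual
# memory (address space), not just resident memory.
rlimit_as(1e9)

# Called with no arguments, it reports the current soft/max limits
rlimit_as()
```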

The problem is that whenever an exercise requires a package, the library() call fails.

Example of errors:

r_1      | Error: package or namespace load failed for ‘png’ in dyn.load(file, DLLpath = DLLpath, ...):
r_1      |  unable to load shared object '/usr/lib/R/site-library/png/libs/png.so':
r_1      |   /usr/lib/R/site-library/png/libs/png.so: failed to map segment from shared object

Or testing manually inside the container (with docker exec):

> unix::rlimit_as(1e9)
$cur
[1] 1e+09

$max
[1] Inf

> library(RMySQL)
Loading required package: DBI
Error: package or namespace load failed for ‘DBI’:
 memory exhausted
Error: package ‘DBI’ could not be loaded

Removing the rlimit_as() call, or moving it to after the library() calls, avoids this error.

I don't understand why this is happening: the exact same commands work fine on my local system. Could it be an issue with the container configuration? I'm at a loss as to how to troubleshoot this.

Links to my R container Dockerfile and docker-compose.yml.

Upvotes: 0

Views: 48

Answers (0)
