We are running R on a Linux cluster. The head node has hung a few times when a user inadvertently took all the memory with an R process. Is there a way to limit R's memory usage under Linux? I'd rather not suggest global ulimits, but that may be the only way forward.
2 Answers
There's unix::rlimit_as(), which lets you set memory limits for a running R process using the same mechanism that ulimit uses in the shell. Windows and macOS are not supported.
In my .Rprofile I have
unix::rlimit_as(1e12, 1e12)
to limit memory usage; the values are in bytes, so 1e12 is roughly 1 TB (for a ~12 GB cap, use 1.2e10 instead).
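If you only want this for the current session, you can also call it interactively. A minimal sketch, assuming the unix package is installed; the values are in bytes and the 4 GB cap is just an example:
unix::rlimit_as(4e9)   # cap this R process's address space at roughly 4 GB
unix::rlimit_as()      # with no arguments, returns the current soft and hard limits
x <- numeric(1e9)      # ~8 GB of doubles, so this allocation should now fail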
Before that...
I had created a small R package, ulimit, with similar functionality.
Install it from GitHub using
devtools::install_github("krlmlr/ulimit")
To limit the memory available to R to 2000 MiB, call:
ulimit::memory_limit(2000)
Now:
> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb
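(That fails because 1e9 integers need about 4e9 bytes, roughly 3.7 GiB, which is well above the 2000 MiB cap.)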
8 Comments
Windows users have memory.limit() at their disposal. My first objective was to get it up and running for my system...
?"Memory-limits" suggests using ulimit or limit.
There is a command-line flag, --max-mem-size (Windows only), which sets the initial limit; this can be increased by the user during the session using memory.limit().
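For reference, a rough sketch of what that looks like on Windows (the flag is passed when starting R; the values here are arbitrary examples, and both tools are Windows-only):
# R started as: Rgui.exe --max-mem-size=2Gb
memory.limit()             # query the current limit, in MB
memory.limit(size = 4000)  # raise the limit to 4000 MB for this session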
Add rsession-memory-limit-mb=4000 to /etc/rstudio/rserver.conf.
ulimit works fine until you want to use all your cores.