
We are running R on a Linux cluster. The head node has hung a few times when a user inadvertently consumed all of its memory with an R process. Is there a way to limit R's memory usage under Linux? I'd rather not impose global ulimits, but that may be the only way forward.

  • I had problems with this before too (link), which might be related to your problem. The solution we ended up with was to disable memory overcommitting on the machine entirely. It is a blunt solution but has worked fine. Commented Sep 25, 2012 at 12:39
  • If, by chance, you use RStudio Server, you can set per-user limits by adding a line like rsession-memory-limit-mb=4000 to /etc/rstudio/rserver.conf Commented Sep 25, 2012 at 12:57
  • Is this unix.stackexchange.com/questions/44985/… useful? (i.e., not an R-specific approach, but if you can come up with a generic per-process solution that works on your OS, then you can set up an alias for R that imposes it.) Seems like this github.com/pshved/timeout would be particularly useful. Commented Sep 25, 2012 at 13:23
  • ulimit works fine until you want to use all your cores. Commented Sep 25, 2012 at 14:51

2 Answers


There's unix::rlimit_as(), which sets a memory limit for a running R process using the same mechanism that ulimit uses in the shell. Windows and macOS are not supported.

In my .Rprofile I have

unix::rlimit_as(1e12, 1e12)

to limit memory usage to ~1 TB (1e12 bytes).
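The two arguments mirror the underlying setrlimit(RLIMIT_AS) pair: a soft limit the process may raise itself, up to a hard ceiling it can never exceed again. The same pair is visible from the shell (a sketch for illustration; note ulimit reports KiB, not bytes):

```shell
# Set a soft cap of ~2 GB and a hard ceiling of ~4 GB, then read both
# back. An unprivileged process may lower its hard limit but cannot
# raise it afterwards.
ulimit -Sv 2000000
ulimit -Hv 4000000
echo "soft=$(ulimit -Sv) hard=$(ulimit -Hv)"
# prints: soft=2000000 hard=4000000
```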

Before that...

I had created a small R package, ulimit, with similar functionality.

Install it from GitHub using

devtools::install_github("krlmlr/ulimit")

To limit the memory available to R to 2000 MiB, call:

ulimit::memory_limit(2000)

Now:

> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb

8 Comments

As you say on GitHub, that'll work on two of the three OSs only and most newbs work on the third. May be worthwhile noting somewhere here ...
@DirkEddelbuettel: Good point. Windows users seem to have memory.limit() at their disposal. My first objective was to get it up and running for my system...
@krlmlr please explain how this is necessary given that mran.revolutionanalytics.com/ is available. Is it complementing it in some way?
Any plans to put this on CRAN?
@MichaelChirico: Not for now, planning to somehow merge it with RAppArmor which offers more of the ulimit API but has other drawbacks.

?"Memory-limits" suggests using ulimit or limit.

There is a command-line flag, --max-mem-size, which sets the initial limit. The user can increase it during the session with memory.limit().
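Whichever mechanism is used, an R session inherits whatever limits are in force in the shell that launches it, so it can be worth checking those first (a quick sketch; values are in KiB, and "unlimited" means no cap is set):

```shell
# Show the soft and hard address-space limits that a process launched
# from this shell (such as R) would inherit.
echo "soft: $(ulimit -Sv)"
echo "hard: $(ulimit -Hv)"
```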

1 Comment

Thanks, James. --max-mem-size is now gone from R, and memory.limit() only applies on Windows. ulimit and limit look like the only way to go.
