On 05/05/2016 10:11, Uwe Ligges wrote:
> On 05.05.2016 04:25, Marius Hofert wrote:
>> Hi Simon,
>> ... all interesting (but quite a bit above my head). I only read
>> 'Linux' and want to throw in that this problem does not appear on
>> Linux (it seems). I talked about this with Martin Maechler and he
>> reported that the same example (on one of his machines; with NA_real_
>> instead of '0's in the matrix) gave:
>> Error: cannot allocate vector of size 70.8 Gb
>> Timing stopped at: 144.79 41.619 202.019
>> ... but no killer around...
> Well, with n=1. ;-)
> Actually this also happens under Linux, and I have had my R processes
> killed more than once (and, much worse, also other processes, so that
> we essentially had to reboot a server). That's why we use job
> scheduling for R on our servers nowadays ...
Yes, Linux does not deal safely with running out of memory, although it
is better than it was. In my experience, only commercial Unices do that
gracefully.
Have you tried setting a (virtual) memory limit on the process using the
shell it is launched from? I have found that to be effective on most
OSes, at least in protecting other processes from being killed.
However, some things do reserve excessive amounts of VM that they do not
use and so cannot be run under a sensible limit.
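For example, something along these lines in a Bourne-style shell (a
minimal sketch, assuming bash, whose ulimit -v takes the limit in
kilobytes and applies to child processes; 'bigjob.R' is just a
placeholder script name):

    # cap the address space of this shell and its children at ~8 GB
    # (ulimit -v takes kilobytes: 8 * 1024 * 1024 = 8388608)
    ulimit -v 8388608
    # run the job; once the cap is reached, allocations fail inside R
    # ("Error: cannot allocate vector of size ...") rather than the
    # OOM killer picking a victim elsewhere on the machine
    R --vanilla -f bigjob.R

The limit applies only to that shell and what it launches, so other
users' processes are unaffected.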
--
Brian D. Ripley, rip...@stats.ox.ac.uk
Emeritus Professor of Applied Statistics, University of Oxford
______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel