Xebar Saram <zeltakc <at> gmail.com> writes:

> 
> Hi All,
> 
> I have a terrible issue I can't seem to debug, and it is halting my work
> completely. I have R 3.0.2 installed on a Linux machine (Arch Linux, latest)
> that I built specifically for running high-memory models; the system
> is a 16-core, 256 GB RAM machine. It worked well at the start, but in
> recent days I keep getting errors and crashes about memory use, such as
> "cannot allocate vector of size XXX" (not enough memory), etc.
> 
> When looking at top (the Linux system monitor) I see I barely scrape 60 GB
> of RAM (out of 256 GB).
> 
> I really don't know how to debug this, and my whole work is halted because
> of it, so any help would be greatly appreciated.

  I'm very sympathetic, but it will be almost impossible to debug
this sort of problem remotely without a reproducible example.
The only guess I can make, if you *really* are running *exactly*
the same code that you previously ran successfully, is that you might
have some very large objects hidden away in a saved workspace (a
.RData file) that is being loaded automatically ...
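
  For instance, something along these lines (an untested sketch, assuming
the stray workspace sits in your current working directory) would show
whether a saved .RData exists and which objects in it are largest:

    file.exists(".RData")      # TRUE means R will restore it at startup
    e <- new.env()
    load(".RData", envir = e)  # inspect it without touching the global env
    sizes <- sapply(ls(e), function(x) object.size(get(x, envir = e)))
    head(sort(sizes, decreasing = TRUE), 10)  # biggest objects, in bytes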

  I would check whether gc(), memory.profile(), etc. give sensible results
in a clean R session (R --vanilla).
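
  Roughly what I have in mind (sketch only; the exact numbers will of
course depend on your setup):

    ## from the shell:
    R --vanilla

    ## then, at the R prompt:
    gc()               # current memory use, after a garbage collection
    memory.profile()   # counts of allocated R objects by type
    ls()               # should be empty if no workspace was loaded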

  Ben Bolker

