Hello,

I have a general question about how to "catch and stop" a function when it 
uses too much memory.

The problem is that some datasets, when passed to nlme (a relatively old 
version), cause the nlme function to hang indefinitely and consume ever more 
memory (this afternoon one of those calls reached about 40GB!) without ever 
returning an answer. Other datasets work fine.

I am trying to debug nlme by varying its parameters, but in the interim I have 
a general question. My situation is the following:

for (i in 1:N) {
    dataset <- createDataset(i)
    try(nlme(dataset, otherParameters))
}

If one of these datasets starts using, say, more than 2GB of memory, I would 
like to stop nlme, get an error, record it, and move on to the next dataset. 
Right now, with some datasets, nlme consumes all of the machine's memory and 
the operating system ends up killing the entire R process.
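Conceptually, something along the following lines is what I'm hoping to end up 
with: a per-call memory cap so that the fit fails with a catchable error 
instead of exhausting the machine. This is only a sketch under assumptions: it 
assumes a Linux/macOS system and the `unix` package, whose `rlimit_as()` caps 
the address space of the current R process; `createDataset()` and 
`otherParameters` are the placeholders from the loop above, not real objects.

```r
# Sketch, not a tested solution. Assumes Linux/macOS and the 'unix' package;
# createDataset() and otherParameters are placeholders from the loop above.
library(unix)
library(nlme)

# Cap the address space at ~2GB (soft limit). Allocations beyond this should
# fail with an R error that try() can catch, rather than the OS killing the
# whole session. Note the cap applies to the entire R process, including the
# memory already in use before the loop starts.
rlimit_as(2 * 1024^3)

results <- vector("list", N)
for (i in 1:N) {
    dataset <- createDataset(i)
    fit <- try(nlme(dataset, otherParameters), silent = TRUE)
    if (inherits(fit, "try-error")) {
        # Record the error message and move on to the next dataset.
        results[[i]] <- conditionMessage(attr(fit, "condition"))
    } else {
        results[[i]] <- fit
    }
}
```

An alternative, if a per-call rather than per-session cap is needed, might be 
to run each fit in its own child R process (e.g. under a shell `ulimit -v`) so 
that only the child is killed when the limit is hit.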

Any suggestions appreciated.

Thank you,

Ramiro

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.