Dear R users

I have the following problem when calling optim() to minimize a function 
"f.obj" (the outer loop) that in turn calls another function "f.con" (a 
contraction mapping, the inner loop).  It seems to me that this is a numerical 
problem that I currently fail to take into account in my code.

Calling optim(), for the first few iterations of the outer loop, everything 
seems fine; the contraction mapping is calculated in each run.  However, after 
a number of outer loop iterations, an error occurs and the following message is 
displayed:

Error in while (max.dev >= tol.in) { :
missing value where TRUE/FALSE needed
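For context, R raises this error whenever a while() condition evaluates to NA or NaN instead of TRUE/FALSE, for example when the loop's deviation measure has become NaN. A minimal reproduction (variable names echo the ones in the post):

```r
# A while() condition must evaluate to TRUE or FALSE; an NA/NaN condition
# raises "missing value where TRUE/FALSE needed".
max.dev <- NaN   # e.g. what max(abs(x - x.in)) returns when x contains NaN
tol.in  <- 1e-9
msg <- tryCatch(
  while (max.dev >= tol.in) break,   # NaN >= 1e-9 evaluates to NA
  error = function(e) conditionMessage(e)
)
print(msg)
```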

The conditional statement ensures that the inner loop keeps iterating as long 
as max.dev <- max(abs(x - x.in)) is greater than the inner-loop tolerance 
(tol.in <- 1E-9), where x is computed by the contraction mapping from x.in.  I 
have tried different stopping rules and tolerance levels, but the result is 
the same.
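A common cause is that, as optim() explores extreme parameter values in the outer loop, the contraction mapping returns NaN or Inf (overflow, log of a negative number, 0/0), so max(abs(x - x.in)) becomes NaN and the while() condition is NA. A defensive sketch of the inner loop, assuming a placeholder mapping since the actual f.con is not shown in the post (all function bodies here are hypothetical):

```r
# Placeholder contraction mapping; the real f.con is not shown in the post.
f.con <- function(x, theta) theta * tanh(x)

inner.loop <- function(x.in, theta, tol.in = 1e-9, max.iter = 1e4) {
  for (i in seq_len(max.iter)) {
    x <- f.con(x.in, theta)
    max.dev <- max(abs(x - x.in))
    # Guard: if the mapping produced NaN/Inf, bail out explicitly instead
    # of letting the loop condition evaluate to NA.
    if (!is.finite(max.dev)) return(NA_real_)
    if (max.dev < tol.in) return(x)
    x.in <- x
  }
  x  # tolerance not reached; return the last iterate
}

# In the objective, map a failed inner loop to a large penalty so optim()
# steps away from the offending parameter region.
f.obj <- function(theta) {
  x <- inner.loop(0.5, theta)
  if (anyNA(x)) return(1e10)
  sum(x^2)  # placeholder objective
}
```

With a guard like this, a diverging contraction shows up as a large objective value rather than as a hard error inside while().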

As I said, I think it is a numerical problem.  Has anyone had similar 
experiences using optim(), and could you give some coding advice?

Thanks in advance,

Jo Reynaerts

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.