Barry Rowlingson <b.rowlingson <at> lancaster.ac.uk> writes:

> On Wed, Sep 8, 2010 at 1:35 PM, Michael Bernsteiner
> <dethlef1 <at> hotmail.com> wrote:
> >
> > Dear all,
> >
> > I'm optimizing a relatively simple function. Using optimize, the
> > optimized parameter value is worse than the starting value. Why?
I would like to stress here that finding a global minimum is not as much sorcery as this thread seems to suggest. A widely accepted procedure for provably identifying a global minimum goes roughly as follows (see Chapter 4 in [1]):

- Make sure the global minimum does not lie 'infinitely' far out.
- Provide estimates for the derivatives/gradients.
- Define a grid fine enough to capture or exclude minima.
- Search the grid cells that come into consideration and compare the candidates.

This can be applied to two- and higher-dimensional problems as well, though it may of course require enormous effort. In science and engineering applications it is at times necessary to actually carry out this approach; see the R sketch at the end of this message.

Hans Werner

[1] F. Bornemann et al., "The SIAM 100-Digit Challenge", SIAM, 2004, p. 79: "In fact, a slightly finer grid search will succeed in locating the proper minimum; several teams used such a search together with estimates based on the partial derivatives of f to show that the search was fine enough to guarantee capture of the answer."

> This looks familiar. Is this some 1-d version of the Rosenbrock
> Banana Function?
>
> http://en.wikipedia.org/wiki/Rosenbrock_function
>
> It's designed to be hard to find the minimum. In the real world one
> would hope that things would not have such a pathological behaviour.
>
> Numerical optimisations are best done using as many methods as
> possible - see optimise, nlm, optim, nlminb and the whole shelf of
> library books devoted to it.
>
> Barry
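P.S. Here is a minimal R sketch of the grid-plus-local-refinement procedure outlined above, for a one-dimensional problem. The test function f, the derivative bound L, and the tolerance are illustrative assumptions on my part, not taken from the original posting.

## Hypothetical test function with several local minima
f <- function(x) sin(3 * x) + 0.5 * (x - 1)^2

## Step 1: confine the search to a finite interval
lo <- -5; hi <- 5

## Steps 2-3: if |f'(x)| <= L on [lo, hi], then the function value at
## the grid point nearest any minimum differs from the minimum by at
## most L * h / 2 for grid spacing h, so a fine enough grid cannot
## miss it. L = 10 is an assumed (generous) bound on |f'| here.
L   <- 10
tol <- 1e-2               # accuracy wanted in the function value
h   <- 2 * tol / L
xs  <- seq(lo, hi, by = h)
fs  <- f(xs)

## Step 4: refine within the best grid cell using a local optimizer
i   <- which.min(fs)
opt <- optimize(f, lower = xs[max(i - 1, 1)],
                   upper = xs[min(i + 1, length(xs))])
opt$minimum    # location of the global minimum (up to the grid guarantee)
opt$objective  # function value there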