To provide another perspective, here is the citation of some work
Harry Joe and I did over two decades ago.

@Article{JoeNash2003,
  author  = {Joe, Harry and Nash, John C.},
  title   = {Numerical optimization and surface estimation with imprecise
             function evaluations},
  journal = {Statistics and Computing},
  year    = {2003},
  volume  = {13},
  pages   = {277--286},
}

Essentially, this fits a quadratic approximately by regression, assuming the
returned objective is imprecise. It is NOT good for high dimensions, of course,
and it is bedeviled by the need to have some idea of the scale of the
imprecision, i.e., the noise amplitude. However, it does work for some
applications. Harry had some success with Monte Carlo evaluation of
multidimensional integrals by optimizing over crude quadratures; that is,
multiple crude quadratures could be more efficient than a single precise
quadrature. However, this approach is not one that can be blindly applied.
There are all sorts of issues about which points to keep in the cloud as the
"fit model, move to model minimum, add points, delete points" process evolves.
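
For readers who want to experiment, here is a minimal R sketch of the
general idea only (NOT the code from the paper): fit a full quadratic to a
cloud of noisy evaluations by least squares, then move to the minimum of
the fitted model. The names `quad_step` and `fquad`, the sampling radius,
and the cloud size are arbitrary illustrative choices.

     ## Sketch of the response-surface idea: sample a point cloud near the
     ## current iterate, fit a quadratic by least squares, then move to the
     ## minimum of the fitted (noise-free) model rather than the noisy fn.
     quad_step <- function(fn, x0, radius = 0.5, npts = 40) {
         d <- length(x0)
         X <- sweep(matrix(runif(npts * d, -radius, radius), ncol = d), 2, x0, "+")
         colnames(X) <- paste0("x", seq_len(d))
         dat <- data.frame(y = apply(X, 1, fn), X)
         rhs <- paste0("poly(", paste(colnames(X), collapse = ", "),
                       ", degree = 2, raw = TRUE)")
         fit <- lm(as.formula(paste("y ~", rhs)), data = dat)
         surrogate <- function(x)
             predict(fit, newdata = as.data.frame(as.list(setNames(x, colnames(X)))))
         optim(x0, surrogate)$par
     }

     ## Example: noisy 2-D quadratic with true minimum at (1, 2); a single
     ## step suffices here only because the test function itself is quadratic.
     fquad <- function(x) sum((x - c(1, 2))^2) * (1 + rnorm(1, sd = 0.01))
     quad_step(fquad, c(0, 0))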


JN

On 2023-08-13 15:28, Hans W wrote:
While working on 'random walk' applications, I got interested in
optimizing noisy objective functions. As an (artificial) example, the
following is the Rosenbrock function, where Gaussian noise of standard
deviation `sd = 0.01` is applied multiplicatively to the function value.

     fn <- function(x)
               (1+rnorm(1, sd=0.01)) * adagio::fnRosenbrock(x)

To smooth out the noise, define another function `fnk(x, k = 1)` that
calls `fn` k times and returns the mean of those k function
evaluations.

     fnk <- function(x, k = 1) {     # fnk(x) same as fn(x)
         rv <- 0.0
         for (i in 1:k) rv <- rv + fn(x)
         return(rv / k)              # mean of the k noisy evaluations
     }
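
As a quick check of the smoothing (assuming `adagio` is installed),
averaging k independent noisy evaluations should shrink the noise
standard deviation by roughly a factor of sqrt(k):

     x0 <- rep(0.1, 5)
     sd(replicate(1000, fn(x0)))           # noise of a single evaluation
     sd(replicate(1000, fnk(x0, k = 10)))  # roughly 1/sqrt(10) of the above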

When we apply several optimization solvers to the noisy and the
noise-smoothed functions, we get, for instance, the following results.
(The starting point is always `rep(0.1, 5)`, the maximal number of
iterations is 5000, the relative tolerance is 1e-12, and the optimization
counts as successful if the function value at the computed minimum is
below 1e-06.)

       k   nmk       anms neldermead     ucminf optim_BFGS
      ---------------------------------------------------
       1  0.21       0.32       0.13       0.00       0.00
       3  0.52       0.63       0.50       0.00       0.00
      10  0.81       0.91       0.87       0.00       0.00

Solvers: nmk = dfoptim::nmk, anms = pracma::anms [both Nelder-Mead codes]
          neldermead = nloptr::neldermead,
          ucminf = ucminf::ucminf, optim_BFGS = optim with method "BFGS"

Read the table as follows: with k = 1, `nmk` is successful in 21% of the
trials, while, for example, `optim` with BFGS never comes close to the
true minimum.
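
For concreteness, here is a sketch of how one such success rate could be
estimated for a single solver, assuming `fn`/`fnk` as above and the
`dfoptim` package; the number of trials, the control settings, and the
choice to judge success on the noise-free Rosenbrock function are
illustrative, not the exact benchmark code.

     library(dfoptim)
     success_rate <- function(k, ntrials = 100) {
         x0 <- rep(0.1, 5)
         ok <- replicate(ntrials, {
             res <- nmk(x0, fnk, k = k,
                        control = list(maxfeval = 5000, tol = 1e-12))
             ## judge success against the noise-free Rosenbrock function
             adagio::fnRosenbrock(res$par) < 1e-6
         })
         mean(ok)
     }
     success_rate(1)    # compare with the 0.21 reported for nmk at k = 1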

I think it is reasonable to assume that gradient-based methods do
poorly with noisy objectives, though I did not expect to see them fail
so clearly. On the other hand, Nelder-Mead implementations do quite
well if there is not too much noise.

In real-world applications, it will often not be possible to repeat the
same measurement several times. That is, we will then have to live
with `k = 1`. In my applications with long 'random walks', repeating the
calculation several times in a row would take considerable time.

QUESTION: What could be other approaches to minimize noisy functions?

I looked through some "Stochastic Programming" tutorials and did not
find them very helpful in this situation. Of course, I might have
looked into these works too superficially.

______________________________________________
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
