Hi,

I'm considering CLOP as one of the compared optimizers in RobustOptimizer
https://github.com/ChinChangYang/RobustOptimizer/issues/68. However, I
have some questions about your experiment.

CLOP is designed for noisy black-box parameter tuning. However, your test
functions (LOG, FLAT, POWER, ANGLE, and STEP) are noise-free, as shown in
Table 1. With noise-free test functions, it is very difficult to show that
CLOP works well on noisy functions.

I suggest perturbing the problem definition f(x) = 1/(1+exp(-r(x))) with a
random variable drawn from a defined zero-mean distribution, such as a
Gaussian, a uniform, or any other. Specifically, the problems can be
defined as g(x) = 1/(1+exp(-r(x) + n(x))), where n(x) is an additive noise
term. The performance of the algorithms can then be evaluated with a
solution error measure, defined as f(x*) - f(x), where x is the solution
returned by the algorithm and x* is the global optimum of the noise-free
function f.
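
For concreteness, here is a minimal Python sketch of the proposed noisy
evaluation and error measure. The one-dimensional r(x), the noise level
sigma, and all function names are my own illustrative assumptions, not
taken from the paper:

    import numpy as np

    def f(x, r):
        # Noise-free win rate: f(x) = 1 / (1 + exp(-r(x))).
        return 1.0 / (1.0 + np.exp(-r(x)))

    def g(x, r, sigma=0.5, rng=None):
        # Noisy variant: g(x) = 1 / (1 + exp(-r(x) + n)), n ~ N(0, sigma^2).
        rng = rng or np.random.default_rng()
        n = rng.normal(0.0, sigma)
        return 1.0 / (1.0 + np.exp(-r(x) + n))

    def solution_error(x_found, x_star, r):
        # Error of the returned solution, measured on the noise-free f.
        # Non-negative for maximization, since x_star is the optimum of f.
        return f(x_star, r) - f(x_found, r)

    # Example: with r(x) = -x^2, the noise-free optimum is x* = 0.
    r = lambda x: -x**2
    print(solution_error(x_found=0.3, x_star=0.0, r=r))

The optimizer would only ever query g; the error is computed afterwards on
the noise-free f.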

BBOB 2012 defines a set of noisy functions
(http://coco.gforge.inria.fr/doku.php?id=bbob-2012) that may also provide a
reliable performance evaluation for noisy optimization.
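
As a reference point, here is a minimal sketch of the multiplicative
Gaussian noise model from the BBOB noisy testbed, as I understand it from
the BBOB documentation (f_GN(f) = f * exp(beta * N(0,1)), with beta
controlling the noise severity); the function name and the idea of applying
it to an arbitrary objective value are my assumptions:

    import numpy as np

    def bbob_gaussian_noise(fval, beta=1.0, rng=None):
        # BBOB-style multiplicative Gaussian noise:
        # f_GN(f) = f * exp(beta * N(0, 1)).
        rng = rng or np.random.default_rng()
        return fval * np.exp(beta * rng.normal(0.0, 1.0))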

There may be performance evaluation methods better suited to win/loss
outcomes than the ones above. In any case, the experiment in this paper
uses noise-free test functions, so it cannot demonstrate anything about
noisy optimization.
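
For win/loss outcomes specifically, one natural protocol is to let the
optimizer observe only a single Bernoulli draw per evaluation, with success
probability f(x), as if one game were played at parameter setting x; a
sketch under that assumption, with names of my own choosing:

    import numpy as np

    def play_one_game(x, r, rng=None):
        # Return 1 (win) with probability f(x) = 1/(1+exp(-r(x))), else 0.
        rng = rng or np.random.default_rng()
        p_win = 1.0 / (1.0 + np.exp(-r(x)))
        return int(rng.random() < p_win)

The optimizer would then see only these noisy 0/1 outcomes, never the
underlying win rate itself.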

Best regards,
Chin-Chang Yang, 2013/03/06

2011/9/1 Rémi Coulom <[email protected]>

> Hi,
>
> This is a draft of the paper I will submit to ACG13.
>
> Title: CLOP: Confident Local Optimization for Noisy Black-Box Parameter
> Tuning
>
> Abstract: Artificial intelligence in games often leads to the problem of
> parameter tuning. Some heuristics may have coefficients, and they should be
> tuned to maximize the win rate of the program. A possible approach consists
> in building local quadratic models of the win rate as a function of program
> parameters. Many local regression algorithms have already been proposed for
> this task, but they are usually not robust enough to deal automatically and
> efficiently with very noisy outputs and non-negative Hessians. The CLOP
> principle, which stands
> for Confident Local OPtimization, is a new approach to local regression
> that overcomes all these problems in a simple and efficient way. It
> consists in discarding samples whose estimated value is confidently
> inferior to the mean of all samples. Experiments demonstrate that, when the
> function to be optimized is smooth, this method outperforms all other
> tested algorithms.
>
> pdf and source code:
> http://remi.coulom.free.fr/CLOP/
>
> Comments, questions, and suggestions for improvement are welcome.
>
> Rémi



-- 
Chin-Chang Yang
_______________________________________________
Computer-go mailing list
[email protected]
http://dvandva.org/cgi-bin/mailman/listinfo/computer-go
