Jason House wrote:


On Dec 12, 2007 3:09 PM, Álvaro Begué <[EMAIL PROTECTED]> wrote:



    On Dec 12, 2007 3:05 PM, Jason House <[EMAIL PROTECTED]> wrote:



        On Dec 12, 2007 2:59 PM, Rémi Coulom <[EMAIL PROTECTED]> wrote:

            > Do you mean a plot of the prediction rate with only the
            > gamma of interest varying?

            Not the prediction rate, but the probability of the
            training data. More precisely, the logarithm of that
            probability.

        I still don't know what you mean by this.


    He probably should use the word "likelihood" instead of
    "probability". http://en.wikipedia.org/wiki/Likelihood_function


Clearly I'm missing something, because I still don't understand. Let's take a simple example: a move on the 3rd line with a gamma value of 1.75. What is the equation, or sequence of discrete values, that I can take the derivative of?

Computing conditional probabilities based on "move is on the 3rd line" and "move is selected" (AKA pure training data) seems to yield a fixed value rather than something approximating a normal distribution.

Consider, in the Elo rating analogy, a player with a win and a loss against a player whose gamma is 1. There you have P(gamma) = gamma/(1+gamma)², whose maximum is at gamma = 1. It is that probability that I am talking about. It is the probability that is maximized by the MM algorithm.

Rémi
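
For concreteness, here is a minimal numerical sketch of that example (my own illustration, not code from Rémi's paper, assuming the standard Bradley-Terry form where P(win) = gamma/(gamma+1) against a gamma = 1 opponent). It evaluates the log-likelihood over a grid of gamma values and confirms the maximum at gamma = 1:

import numpy as np

# One win and one loss against an opponent whose gamma is fixed at 1:
#   P(win)  = gamma / (gamma + 1)
#   P(loss) =     1 / (gamma + 1)
# so the probability of the training data is gamma / (1 + gamma)**2.
# The derivative of its logarithm, 1/gamma - 2/(1 + gamma), vanishes at
# gamma = 1, which is the maximum Rémi describes.

def log_likelihood(gamma):
    """Log-probability of one win and one loss vs. a gamma = 1 opponent."""
    return np.log(gamma) - 2.0 * np.log(1.0 + gamma)

gammas = np.linspace(0.1, 5.0, 491)   # grid with step 0.01
best = gammas[np.argmax(log_likelihood(gammas))]
print(best)                           # approximately 1.0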