> I'm actually kind of surprised at the dissimilarity between the
> normal and multinormal. I'd expect the multinormal to boil down to the
> normal, but it looks like the standard normal has additional terms.
The multivariate normal has the 1-d normal as a special case, but instead of normalizing by the "area under the curve", you normalize by the determinant of the covariance matrix, which captures how "squishy" the distribution is along each pair of dimensions. And the mean is a vector instead of a number, of course. Collapse the covariance matrix to a single real number (the variance) and the mean vector to a scalar, and voila: the 1-d normal appears.

Randomly generating multivariate normals out of thin air (i.e., drawing covariance matrices via the inverse Wishart) is a little bit tricky, albeit quite fast if you do it right. The cool thing about doing this is that you can converge to the "actual" distribution fairly rapidly, if you're willing to put up with some chicanery along the way. Suppose each state in your model of interest is represented by one of these multivariate Gaussians, and that there is a transition matrix giving the probability of moving from one state to another; with enough sampling, you can magically converge to that transition matrix. Computing the state and transition updates is then as fast as a single matrix multiplication. If you're trying to model something with few enough states (say, fewer than 100), this is incredibly effective given enough training data.

I wonder what elements in Go can be represented by a semi-reasonable number of distinct states that often lead from one to the next...?

10,000-foot view: http://en.wikipedia.org/wiki/Hidden_Markov_models

s.

_______________________________________________
computer-go mailing list
computer-go@computer-go.org
http://www.computer-go.org/mailman/listinfo/computer-go/
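To make the two claims above concrete, here's a small sketch (not from the original post; just an illustration using NumPy). It checks numerically that the multivariate normal density with a 1x1 covariance matrix collapses to the ordinary 1-d normal, where the determinant of the covariance matrix plays the role of the variance, and it shows a transition-matrix step as a single matrix multiplication:

```python
import numpy as np

# Multivariate normal density:
#   N(x; mu, Sigma) = exp(-0.5 (x-mu)^T Sigma^{-1} (x-mu))
#                     / sqrt((2*pi)^d * det(Sigma))
def mvn_pdf(x, mu, sigma):
    x = np.atleast_1d(x).astype(float)
    mu = np.atleast_1d(mu).astype(float)
    sigma = np.atleast_2d(sigma).astype(float)
    d = len(mu)
    diff = x - mu
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma))
    expo = -0.5 * diff @ np.linalg.solve(sigma, diff)
    return np.exp(expo) / norm

# Plain 1-d normal density for comparison.
def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

# With a 1x1 covariance matrix, det(Sigma) is just the variance,
# so the multivariate density reduces to the 1-d normal.
assert np.isclose(mvn_pdf([1.3], [0.5], [[2.0]]),
                  normal_pdf(1.3, 0.5, 2.0))

# Advancing a state distribution is one matrix multiplication:
# T is a row-stochastic transition matrix, p a state distribution.
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])
p = np.array([1.0, 0.0])   # start in state 0 with certainty
p_next = p @ T             # distribution after one transition
print(p_next)              # [0.9 0.1]
```

The `mvn_pdf`/`normal_pdf` names and the toy 2-state transition matrix are made up for illustration; in a real HMM the emission density of each state would be one of these Gaussians, and the same `p @ T` step would drive the state predictions.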