> -----Original Message-----
> From: Cenny Wenner <[EMAIL PROTECTED]>
> To: computer-go <computer-go@computer-go.org>
> Sent: Wed, 19 Sep 2007 11:31 am
> Subject: Re: [computer-go] Re: Most common 3x3 patterns
>
> On 9/19/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
> > I neglected the rather important detail that these patterns are trained
> > on 9x9 games. Training on 19x19 games produces different scores than
> > these. I've tried it both ways (it's much easier to get a large set of
> > 19x19 games for training) and this set is the one I now use for both
> > 9x9 and 19x19. But my program's performance on 19x19 is terrible either
> > way.
> >
> > IIRC, if I train on 19x19 games, but only keep track of patterns with a
> > center within the 5x5 window around the enemy's previous move, then I
> > get scores very similar to those from 9x9 games.
>
> Care to elaborate on what you mean by scores here and how they are
> similar to the 9x9 equivalence?

OK. There are many ways to derive a score for a pattern and, naturally, it matters what you're going to use it for. For these patterns, I calculated the score in the obvious way that everyone thinks of first.

I took a file with ~22,000 9x9 games from NNGS. Most of these games were unusable, so I automatically filtered most of them out and kept only those from low-kyu or dan level players where there weren't any obvious problems in the game record, such as illegal moves or absurd handicaps. I added a couple hundred pro games to the set.

For every non-pass move in every game, I trained the patterns. My program looked at every legal move, checked the 3x3 pattern around it, and incremented the counter "hits" for that pattern. If the same pattern showed up 5 times on the board for one turn, "hits" was incremented 5 times. Then it looked at the pattern around the move the player actually made and incremented the counter "moves" once for that pattern. (By my convention, it is always white's turn to move, so I mirrored the colors when needed.)

After training, the score for each pattern was (100 * moves) / hits, which gives a score between 0 and 100.

Because it's difficult to collect a lot of decent 9x9 games by human players, many of the patterns have noisy scores based on very few samples. It's relatively easy to collect 19x19 game records, but their scores, calculated this way, come out fairly different. In some experiments, I used a large set of 19x19 games but only incremented the counters for patterns a small distance away from the opponent's previous move. When I say those scores are similar to the 9x9 ones, I mean they tended to be within a few points of each other whenever there was a reasonable sample size for the pattern. For noisy cases, they were at least consistent: the 9x9 games might give a score of 84 from a sample size (number of hits for the pattern) of only 19, while the 19x19 games gave a (still very high) score of 46 from 1019 samples.

There are much more sophisticated ways to calculate a score, and I've used this one for different things. How am I using it now? AntIgo's heavy Monte Carlo playouts are derived from the description in the first Mogo paper. If the MC player can't find a move to rescue a threatened group, it looks for any moves with a "good" pattern close to the foe's previous move. If it finds any, it chooses one at random. My threshold for a "good" pattern is a score >= 2, so the score is used as an essentially binary one.

"Measured with a micrometer, marked with a piece of chalk, cut with an axe."
- Dave Hillis
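For anyone who wants the counting spelled out, here is a minimal Python sketch of the hits/moves bookkeeping described above. It is not AntIgo's code: the board, legal_moves, and pattern_at helpers are hypothetical stand-ins for whatever board code you already have, and the color mirroring ("always white to move") is assumed to happen inside pattern_at.

    from collections import defaultdict

    # Counters keyed by a canonical 3x3 pattern, e.g. a 9-character string over
    # '.', 'W', 'B', '#' (empty, white, black, off-board), mirrored so that it
    # is always white to move.
    hits = defaultdict(int)    # times the pattern occurred around any legal move
    moves = defaultdict(int)   # times the player actually played into it

    def train_on_position(board, played_move, legal_moves, pattern_at):
        # One training step for a single (position, played move) pair.
        # legal_moves(board) and pattern_at(board, point) are hypothetical helpers.
        for point in legal_moves(board):
            hits[pattern_at(board, point)] += 1   # duplicates on one turn all count
        moves[pattern_at(board, played_move)] += 1

    def score(pattern):
        # (100 * moves) / hits: how often the pattern was chosen when available.
        if hits[pattern] == 0:
            return 0.0
        return 100.0 * moves[pattern] / hits[pattern]

With this convention a never-seen pattern simply scores 0, which matches treating it as "not good" in the playouts.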
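And a similar sketch of how a binary "good pattern" test could slot into a Mogo-style playout step: when no rescue move was found, gather the legal moves near the opponent's last move whose pattern clears the threshold and pick one uniformly at random. Again, the helper names are assumptions, not AntIgo's actual interfaces.

    import random

    GOOD_THRESHOLD = 2   # score >= 2 is treated as a "good" pattern

    def pattern_move(board, last_opponent_move, legal_moves_near, pattern_at, score):
        # legal_moves_near(board, point) is a hypothetical helper returning the
        # legal moves in a small window around `point`. Returns None when no
        # candidate exists, so the playout can fall back to its default policy.
        candidates = [p for p in legal_moves_near(board, last_opponent_move)
                      if score(pattern_at(board, p)) >= GOOD_THRESHOLD]
        return random.choice(candidates) if candidates else None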