Re: [Computer-go] Weak bots to run on CGOS

2015-03-20 Thread Urban Hafner
So, I now have a new version of my bot running on CGOS (http://cgos.boardspace.net/13x13/cross/Imrscl-016-AMAF.html). It's still considerably weaker than GnuGo, so I'm pretty sure it will lose all games against it. However, it's now much stronger than any other bot running on CGOS and I guess it w…

Re: [Computer-go] Weak bots to run on CGOS

2015-03-20 Thread Christoph Birk
On Mar 20, 2015, at 5:11 AM, Urban Hafner wrote: > So, I now have a new version of my bot running on CGOS > (http://cgos.boardspace.net/13x13/cross/Imrscl-016-AMAF.html). It's still > considerably weaker than GnuGo, so I'm pretty sure it will lose all games > against it. However, it's now much…

Re: [Computer-go] Weak bots to run on CGOS

2015-03-20 Thread Urban Hafner
Thanks Christoph! On Fri, Mar 20, 2015 at 4:17 PM, Christoph Birk <b...@obs.carnegiescience.edu> wrote: > > On Mar 20, 2015, at 5:11 AM, Urban Hafner wrote: > > So, I now have a new version of my bot running on CGOS > (http://cgos.boardspace.net/13x13/cross/Imrscl-016-AMAF.html). It's still >…

Re: [Computer-go] Teaching Deep Convolutional Neural Networks to Play Go

2015-03-20 Thread Hugh Perkins
On 3/17/15, David Silver wrote: > Reinforcement learning is different to unsupervised learning. We used > reinforcement learning to train the Atari games. Also we published a more > recent paper (www.nature.com/articles/nature14236) that applied the same > network to 50 different Atari games (achi

Re: [Computer-go] Representing Komi for neural network

2015-03-20 Thread Hugh Perkins
On 1/12/15, Álvaro Begué wrote: > A CNN that starts with a board and returns a single number will typically > have a few fully-connected layers at the end. You could make the komi an > extra input in the first one of those layers, or perhaps in each of them. That's an interesting idea. But then,
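For concreteness, here is a minimal numpy sketch of the idea Álvaro describes: appending komi as one extra scalar input to the first fully-connected layer after the convolutional stack. All names, shapes and sizes below are illustrative assumptions, not anything stated in the thread.

```python
import numpy as np

def fc_with_komi(conv_features, komi, W, b):
    """First fully-connected layer with komi appended as one extra input.

    conv_features: flattened output of the last conv layer, shape (n_feat,)
    komi: scalar, e.g. 7.5
    W: weights, shape (n_hidden, n_feat + 1); b: bias, shape (n_hidden,)
    """
    x = np.concatenate([conv_features, [komi]])  # board features plus komi
    return np.maximum(0.0, W @ x + b)            # ReLU hidden activations

# Illustrative sizes: 13x13 board, 32 feature planes out of the conv stack.
n_feat, n_hidden = 13 * 13 * 32, 256
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(n_hidden, n_feat + 1))
b = np.zeros(n_hidden)
hidden = fc_with_komi(rng.normal(size=n_feat), komi=7.5, W=W, b=b)
```

The same trick extends to feeding komi into each of the later fully-connected layers as well, as the quoted suggestion allows.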

[Computer-go] Fwd: Representing Komi for neural network

2015-03-20 Thread Hugh Perkins
> But then, the komi won't really participate in the hierarchical representation we are hoping that the network will build, which I suppose we are hoping is the key to obtaining human-comparable results? Well... it seems that Hinton, in his dropout paper http://arxiv.org/pdf/1207.0580.pdf, get kin…
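For readers without the cited paper to hand, the core mechanism Hinton et al. describe is simple. A minimal sketch of dropout, written here in the now-common "inverted" form (rescaling at training time) rather than exactly as in the paper, and not taken from anything in this thread:

```python
import numpy as np

def dropout(activations, p_drop=0.5, rng=None):
    """Training-time dropout: zero each unit with probability p_drop and
    rescale the survivors so the expected activation is unchanged.
    At test time the layer is simply left untouched."""
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(activations.shape) >= p_drop  # keep each unit with prob 1 - p_drop
    return activations * mask / (1.0 - p_drop)
```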

[Computer-go] Fwd: Representing Komi for neural network

2015-03-20 Thread Hugh Perkins
> Perhaps what we want is a compromise between convnets and fully-connected layers though? i.e., either take an fc layer and make it a bit more sparse, and/or take an fc layer and randomly link sets of weights together? Maybe something like: each filter consists of e.g. 16 weights, which are assigned randomly over all input-out…
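One possible reading of that idea, as a rough numpy sketch: a fully-connected layer whose weight matrix entries are all tied to a small pool of shared weights through a random assignment fixed at initialisation, so only the pool itself would be trained. The pool size of 16 comes from the message above; every other name and number is an assumption.

```python
import numpy as np

def tied_random_fc(n_in, n_out, pool_size=16, rng=np.random.default_rng(0)):
    """Build a layer where every entry of the (n_out, n_in) weight matrix is
    tied to one of `pool_size` shared weights, chosen randomly once at init."""
    assignment = rng.integers(pool_size, size=(n_out, n_in))  # fixed tying pattern
    pool = rng.normal(scale=0.05, size=pool_size)             # the 16 shared weights
    return pool, assignment

def forward(x, pool, assignment):
    W = pool[assignment]  # expand the shared weights into the full matrix
    return W @ x

pool, assignment = tied_random_fc(n_in=169, n_out=64)
y = forward(np.random.default_rng(1).normal(size=169), pool, assignment)
```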

Re: [Computer-go] Representing Komi for neural network

2015-03-20 Thread Álvaro Begué
On Fri, Mar 20, 2015 at 8:24 PM, Hugh Perkins wrote: > On 1/12/15, Álvaro Begué wrote: > > A CNN that starts with a board and returns a single number will typically > > have a few fully-connected layers at the end. You could make the komi an > > extra input in the first one of those layers, or p

Re: [Computer-go] Representing Komi for neural network

2015-03-20 Thread Hugh Perkins
On Sat, Mar 21, 2015 at 11:41 AM, Álvaro Begué wrote: > I don't see why komi needs to participate in the hierarchical representation > at all. Yes, fair point. I guess I was taking 'komi' as an example of any additional natural number that one might wish to feed into a net. But you're right, in t…