Looks like you're making good progress. Apart from the time gained in training, 
you'll probably get a similar speedup when using the DNN during play? I'm 
curious when the improvement in play will start to outweigh the extra 
computational cost.
Mark

> On Apr 26, 2016, at 9:55 PM, David Fotland <fotl...@smart-games.com> wrote:
> 
> I have my deep neural net training setup working, and it's working so well I
> want to share.  I already had Caffe running on my desktop machine (4-core
> i7) without a GPU, with AlphaGo-style inputs generated by Many Faces into an
> LMDB database.  I trained a few small nets for a day each to get a feel for
> it.
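For anyone setting up a similar pipeline: a minimal sketch of packing feature
planes into LMDB for Caffe might look like the Python below.  The plane count,
board size, key format, and the write_positions helper are illustrative
assumptions, not Many Faces' actual export format.

import lmdb
import numpy as np
import caffe

NUM_PLANES = 42   # feature planes per position (assumed)
BOARD_SIZE = 19

def write_positions(db_path, positions, labels):
    # positions: iterable of (NUM_PLANES, 19, 19) uint8 arrays
    # labels: integer index of the move actually played
    env = lmdb.open(db_path, map_size=1 << 40)  # map_size is just an upper bound on Linux
    with env.begin(write=True) as txn:
        for i, (planes, label) in enumerate(zip(positions, labels)):
            datum = caffe.io.array_to_datum(planes.astype(np.uint8), int(label))
            txn.put(b"%010d" % i, datum.SerializeToString())
    env.close()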
> 
> I bought an Alienware Area 51 from Dell, with two GTX 980 TI GPUs, 16 GB of
> memory, and 2 TB of disk.  I set it up to dual boot Ubuntu 14.04, which made
> it trivial to get the latest caffe up and running with CUDNN.  2 TB of disk
> is not enough.  I'll have to add another drive.
> 
> I expected something like 20x speedup on training, but I was shocked by what
> I actually got.
> 
> On my desktop, the Caffe MNIST sample took 27 minutes to complete.  On the
> new machine it was 22 seconds.  73x faster.
> 
> My simple network has 42 input planes, and 4 layers of 48 filters each.
> Training runs about 100x faster on the Alienware.  Training 100k Caffe
> iterations (batches) of 50 positions takes 13 minutes, rather than almost a
> full day on my desktop.
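A rough NetSpec sketch of a net with that shape, reading 50-position batches
from LMDB, is below.  The 3x3 filter size, xavier fillers, 361-way softmax
move-prediction head, and the small_policy_net name are guesses for
illustration; the mail doesn't describe those details.

from caffe import layers as L, params as P
import caffe

def small_policy_net(lmdb_path, batch_size=50):
    n = caffe.NetSpec()
    # 50-position batches read straight from the LMDB database
    n.data, n.label = L.Data(batch_size=batch_size, backend=P.Data.LMDB,
                             source=lmdb_path, ntop=2)
    bottom = n.data
    for i in range(1, 5):            # 4 conv layers of 48 filters each
        conv = L.Convolution(bottom, num_output=48, kernel_size=3, pad=1,
                             weight_filler=dict(type='xavier'))
        setattr(n, 'conv%d' % i, conv)
        setattr(n, 'relu%d' % i, L.ReLU(conv, in_place=True))
        bottom = conv
    # 19x19 = 361-way move prediction head (assumed output layer)
    n.score = L.InnerProduct(bottom, num_output=361,
                             weight_filler=dict(type='xavier'))
    n.loss = L.SoftmaxWithLoss(n.score, n.label)
    return n.to_proto()

Writing str(small_policy_net('go_train_lmdb')) to a train prototxt and pointing
a standard solver at it (with solver_mode: GPU) should be enough to reproduce
this kind of run on a CUDNN build of Caffe.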
> 
> David
> 
