On Tue, May 23, 2017 at 4:51 AM, Hideki Kato <hideki_ka...@ybb.ne.jp> wrote:

> (3) CNN cannot learn the exclusive-or function due to the ReLU
> activation function, unlike the traditional sigmoid (hyperbolic
> tangent).  CNN is good at approximating continuous (analog)
> functions but not Boolean (digital) ones.
>

Oh, not this nonsense with the XOR function again.

You can see a neural network with ReLU activation function learning XOR
right here:

http://playground.tensorflow.org/#activation=relu&batchSize=10&dataset=xor&regDataset=reg-plane&learningRate=0.01&regularizationRate=0&noise=0&networkShape=4,4&seed=0.96791&showTestData=false&discretize=false&percTrainData=50&x=true&y=true&xTimesY=false&xSquared=false&ySquared=false&cosX=false&sinX=false&cosY=false&sinY=false&collectStats=false&problem=classification&initZero=false&hideText=false
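
In fact, you don't even need training to make the point: two hidden ReLU
units can compute XOR exactly. Here is a self-contained sketch (my own
illustration with hand-picked weights, not something from the playground
link above):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def xor_net(x1, x2):
    """XOR with two hidden ReLU units and hand-picked weights:
    h1 = relu(x1 + x2), h2 = relu(x1 + x2 - 1), output = h1 - 2*h2.
    Check: (0,0)->0, (0,1)->1, (1,0)->1, (1,1)->2-2=0."""
    h1 = relu(x1 + x2)
    h2 = relu(x1 + x2 - 1.0)
    return h1 - 2.0 * h2

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, int(xor_net(a, b)))  # last column: 0, 1, 1, 0
```

Gradient descent has to *find* such weights, which the playground demo
shows it does; the construction above just shows ReLU networks have no
trouble representing XOR in the first place.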

Enjoy,
Álvaro.
_______________________________________________
Computer-go mailing list
Computer-go@computer-go.org
http://computer-go.org/mailman/listinfo/computer-go