Hey, thanks for the "*COMPETENT PROGRAM EVOLUTION*" paper. I'll take a
closer look at that.

And I too think using reference material is a good start. The only clear
outline of the backprop algorithm I found was in this paper:
page.mi.fu-berlin.de/rojas/neural/chapter/K7.pdf. I haven't found a paper
or text with good datasets. If you know where I can get some other good
datasets that have been used in a backprop system, I'm keen to get my
mitts on them :) The only other dataset I had my eye on was the sunspot
data used in the encog Clojure wrapper
<https://github.com/jimpil/enclog/blob/master/src/java/PredictSunspotSVM.java>.
So I think that'll help.

Thanks
Tim


On Mon, Nov 19, 2012 at 10:49 AM, Andreas Liljeqvist <bon...@gmail.com> wrote:

> If you have too many neurons or too small a dataset, you risk
> learning the features of the training set rather than the general
> structure of the problem.
>
> ;; columns are [e1 e2 answer]
> (def dataset [[1 1 2] [2 2 4] [4 4 8]])
>
> Are you learning addition here or a doubling function?
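>
> You can't tell from that data alone; both hypotheses fit it perfectly.
> A quick sketch (the two hypothesis functions are mine, just for
> illustration):
>
> (def add-hypothesis    (fn [[a b]] (+ a b)))
> (def double-hypothesis (fn [[a b]] (* 2 a)))
>
> ;; they agree on every training example, because e1 = e2 throughout:
> (map (fn [[a b answer]]
>        [(add-hypothesis [a b]) (double-hypothesis [a b]) answer])
>      dataset)
> ;; => ([2 2 2] [4 4 4] [8 8 8])
>
> ;; only an example where e1 and e2 differ separates them:
> (add-hypothesis [1 3])    ;; => 4
> (double-hypothesis [1 3]) ;; => 2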
>
> If you have enough neurons, you could also more or less encode a lookup
> table instead of the logic behind it, e.g.
>
> (defn func [x] ({[1 1] 2 [2 2] 4} x)) ; of course it wouldn't literally look like this inside a NN
>
> instead of:
>
> (defn func [x] (apply + x))
>
> It will break down as soon as it sees data that isn't in the training set.
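>
> You can see the failure at the REPL, with the lookup version of func above:
>
> (func [1 1]) ;; => 2, memorised from the training data
> (func [3 5]) ;; => nil, no entry in the lookup, so no answer at all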
>
> About validating your implementation:
> try to find a textbook where they calculate a simple backprop
> network by hand, then steal their numbers and turn them into unit tests.
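> Something like this, where the neuron function and the hand-worked
> number are just a sketch (a single sigmoid neuron you can verify on
> paper), not taken from any particular textbook:
>
> (require '[clojure.test :refer [deftest is run-tests]])
>
> (defn sigmoid [x] (/ 1.0 (+ 1.0 (Math/exp (- x)))))
>
> ;; one neuron's forward pass: weighted sum plus bias, through the sigmoid
> (defn neuron-output [weights bias inputs]
>   (sigmoid (+ bias (reduce + (map * weights inputs)))))
>
> (defn approx= [a b] (< (Math/abs (- (double a) (double b))) 1e-6))
>
> ;; by hand: 0.5*1 + 0.5*1 + 0 = 1.0, and sigmoid(1.0) = 0.7310585786...
> (deftest forward-pass-matches-hand-calculation
>   (is (approx= 0.7310585786 (neuron-output [0.5 0.5] 0.0 [1.0 1.0]))))
>
> (run-tests)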
