Hey All,

Just following up on this post, as I'm trying to figure out how to calculate
an error's partial derivative in an ANN. I'm constructing a feed-forward
artificial neural network <https://github.com/twashing/nn>, using resilient
propagation (RPROP) training. At the moment, I'm trying to implement the
weight update algorithm for an individual neuron input. For the life of me,
I can't find a clear and straightforward answer on how to calculate the
partial derivative of the error with respect to a given weight. The only
thing I can find on the web is the fact that a neuron's weight update is a
function of dE/dW. See the original RPROP
paper <http://paginas.fe.up.pt/~ee02162/dissertacao/RPROP%20paper.pdf> (pp. 2-3),
or this
one <http://ejournals.uofk.edu/index.php/engineering/article/download/115/126> (p. 4).
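
As far as I can tell from the paper, the RPROP part itself only ever needs
the *sign* of dE/dW. Here's a minimal Clojure sketch of how I currently read
that rule (the eta constants and function names are my own placeholders, not
anything from encog):

    (def eta-plus 1.2)   ;; step-size increase factor (from the paper)
    (def eta-minus 0.5)  ;; step-size decrease factor (from the paper)

    (defn sign [x]
      (cond (pos? x) 1
            (neg? x) -1
            :else 0))

    (defn rprop-step
      "Sketch of the basic RPROP rule: grow the step size when the gradient
       keeps its sign, shrink it when the sign flips, and move the weight
       against the sign of the current gradient. (The paper also reverts the
       previous weight step when the sign flips; that's omitted here.)"
      [prev-gradient gradient prev-delta]
      (let [product (* prev-gradient gradient)
            delta (cond (pos? product) (* prev-delta eta-plus)
                        (neg? product) (* prev-delta eta-minus)
                        :else prev-delta)]
        {:delta delta
         :weight-change (- (* (sign gradient) delta))}))

But that still leaves the question of where the gradient itself comes from,
which is what I'm stuck on.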

But no one actually outlines how to calculate it. I understand the concept
of a partial derivative in the mathematical
sense <http://answers.yahoo.com/question/index?qid=20080506020512AARuMSB>,
and I assume that the current neuron input's weight is the variable being
differentiated with respect to, while all other neuron input values are held
constant. I've also dug through the encog source
code <https://github.com/encog/encog-java-core/blob/master/src/main/java/org/encog/neural/networks/training/propagation/resilient/ResilientPropagation.java>
and can't find a clear explanation.
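
For what it's worth, this is my best guess so far at how the gradient
factors apart, assuming a sigmoid activation and a squared-error function
(neither of which I've confirmed against encog):

    ;; My current understanding, assuming sigmoid activation and squared error:
    ;;
    ;;   dE/dw_ij = delta_j * o_i
    ;;
    ;;   output neuron:  delta_j = (o_j - target_j) * o_j * (1 - o_j)
    ;;   hidden neuron:  delta_j = (sum_k delta_k * w_jk) * o_j * (1 - o_j)
    ;;                   where k ranges over the neurons that j feeds into

    (defn output-delta [output target]
      (* (- output target) output (- 1 output)))

    (defn hidden-delta [output downstream-deltas downstream-weights]
      (* (reduce + (map * downstream-deltas downstream-weights))
         output (- 1 output)))

    (defn error-partial-derivative
      "dE/dW for the weight on the connection carrying input-value into the neuron."
      [delta input-value]
      (* delta input-value))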

So for each of the neurons below, I calculate each input's individual
error by taking the total error ( *-0.3963277746392987* ) and multiplying it
by that input's weight ( each :calculated-error is the sum of the individual
inputs' errors ). For both neurons, what would be the weight change for each
input? (I've put a sketch of my best guess at the calculation after the data
below.)


:input-layer

 ({:calculated-error -1.0991814559154283,
   :calculated-value 0.9908633780805893,
   :inputs
   ({:error -0.07709937922001887,
     :calculated 0.4377023624017325,
     :key :avolume,
     :value 2.25,
     :weight 0.19453438328965889,
     :bias 0}
    {:error -0.19625185888745333,
     :calculated 1.4855269156904067,
     :key :bvolume,
     :value 3.0,
     :weight 0.4951756385634689,
     :bias 0}
    {:error -0.3072203938672436,
     :calculated 1.0261589301119642,
     :key :ask,
     :value 1.32379,
     :weight 0.7751674586693994,
     :bias 0}
    {:error -0.36920086975057054,
     :calculated 1.2332848282147972,
     :key :bid,
     :value 1.3239,
     :weight 0.9315543683169403,
     :bias 0}
    {:error -0.14940895419014188,
     :calculated 0.5036129016361643,
     :key :time,
     :value 1.335902400676,
     :weight 0.37698330460468044,
     :bias 0}),
   :id "583c10bfdbd326ba525bda5d13a0a894b947ffc"}
  ...)

:output-layer

 ({:calculated-error -1.1139741279964241,
   :calculated-value 0.9275622253607013,
   :inputs
   ({:error -0.2016795955938916,
     :calculated 0.48962608882549025,
     :input-id "583c10bfdbd326ba525bda5d13a0a894b947ffb",
     :weight 0.5088707087900713,
     :bias 0}
    {:error -0.15359996014735702,
     :calculated 0.3095962076691644,
     :input-id "583c10bfdbd326ba525bda5d13a0a894b947ffa",
     :weight 0.38755790024342773,
     :bias 0}
    {:error -0.11659507401745359,
     :calculated 0.23938733624830652,
     :input-id "583c10bfdbd326ba525bda5d13a0a894b947ff9",
     :weight 0.2941885012312543,
     :bias 0}
    {:error -0.2784739949663631,
     :calculated 0.6681581686752845,
     :input-id "583c10bfdbd326ba525bda5d13a0a894b947ff8",
     :weight 0.7026355778870271,
     :bias 0}
    {:error -0.36362550327135884,
     :calculated 0.8430641676611533,
     :input-id "583c10bfdbd326ba525bda5d13a0a894b947ff7",
     :weight 0.9174868039523537,
     :bias 0}),
   :id "583c10bfdbd326ba525bda5d13a0a894b947ff6"})



Thanks in advance


Tim Washington
Interruptsoftware.ca



On Mon, Aug 6, 2012 at 7:03 PM, Timothy Washington <twash...@gmail.com> wrote:

> Hey Jim,
>
> Yes, that was actually the first place I was going to post the question.
> But what I noticed was that the "Financial Neural
> Network<http://www.heatonresearch.com/neural-network-forums/financial-neural-networks>"
> section was geared more towards users of financial software than towards
> software developers. But I'm probably being too cautious in that
> respect. Ok, I've moved that post <http://www.heatonresearch.com/node/2716> to
> the "Financial Neural Network" section.
>
>
> Thanks for the nudge :)
>
> Tim Washington
> Interruptsoftware.ca
>
>
>
> On Mon, Aug 6, 2012 at 12:33 PM, Jim - FooBar(); <jimpil1...@gmail.com> wrote:
>
>>  I'm surprised you didn't post your question on the "Financial Neural
>> Networks" section of the forum rather than on "Using encog in Java"!
>> You did see it, right?
>>
>> Just trying to help here... :-) . I'm genuinely interested in your
>> problem and I'd love to see the solution unfold! In fact if you get it
>> going I'd love to add it in the clojure-encog examples...
>>
>> Jim
>>
>>
