Hi Oleks,
is there a way to install the neural network with Metacello?
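
For reference, loading a project with Metacello usually looks like the expression below. The baseline name and repository URL here are hypothetical placeholders, since I don't know whether the project has a published baseline yet:

```smalltalk
"Hypothetical example: 'NeuralNetwork' and the github:// URL are
 placeholders, not the project's actual coordinates."
Metacello new
    baseline: 'NeuralNetwork';
    repository: 'github://your-user/your-repo/src';
    load.
```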

2017-04-25 13:00 GMT+02:00 Alexandre Bergel <alexandre.ber...@me.com>:

> Continue to push that topic Oleks. You are on the right track!
>
> Alexandre
>
> > On Apr 24, 2017, at 1:43 AM, Oleks <olk.zayt...@gmail.com> wrote:
> >
> > Hello,
> >
> > Thanks a lot for your advice! It was very helpful and educating (for
> > example, I thought that we store biases in the weight matrix and
> > prepend 1 to the input to make it faster, but now I see why it's
> > actually slower that way).
> >
> > I've implemented a multi-layer neural network as a linked list of
> > layers that propagate the input and error from one to another, similar
> > to the Chain of Responsibility pattern. Also, I now represent biases
> > as separate vectors. The LearningAlgorithm is a separate class with
> > Backpropagation as its subclass (at this point the network can only
> > learn through backpropagation, but I'm planning to change that). I'm
> > trying to figure out how the activation and cost functions should be
> > connected. For example, cross-entropy works best with logistic sigmoid
> > activation, etc. I would like to give the user the freedom to use
> > whatever they want (plug in whatever you like and see what happens),
> > but that can be very inefficient (because some time-consuming parts of
> > the activation and cost derivatives cancel each other out).
> >
> > Also, there is an interface for setting the learning rate for the
> > whole network, which can be used to choose the learning rate prior to
> > learning, as well as to change it after each iteration. I am planning
> > to implement some optimization algorithms that would automate the
> > choice of learning rate (Adagrad, for example), but this would require
> > a somewhat different design (maybe I will implement the Optimizer, as
> > you suggested).
> >
> > I'm attaching two images with UML diagrams describing my current
> > implementation. Could you please tell me what you think about this
> > design? The first image is a class diagram that shows the whole
> > architecture, and the second one is a sequence diagram of
> > backpropagation.
> >
> > mlnn.png <http://forum.world.st/file/n4943698/mlnn.png>
> > backprop.png <http://forum.world.st/file/n4943698/backprop.png>
> >
> > Sincerely yours,
> > Oleksandr
> >
> >
> >
> > --
> > View this message in context: http://forum.world.st/Neural-Networks-in-Pharo-tp4941271p4943698.html
> > Sent from the Pharo Smalltalk Users mailing list archive at Nabble.com.
> >
>
> --
> _,.;:~^~:;._,.;:~^~:;._,.;:~^~:;._,.;:~^~:;._,.;:
> Alexandre Bergel  http://www.bergel.eu
> ^~:;._,.;:~^~:;._,.;:~^~:;._,.;:~^~:;._,.;:~^~:;.
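
On the activation/cost pairing mentioned above: the reason cross-entropy and the logistic sigmoid work well together is that their derivatives cancel. With sigmoid output a and target y, dC/da = (a - y) / (a(1 - a)) and da/dz = a(1 - a), so the output-layer error reduces to simply a - y. A sketch of how a Backpropagation subclass might exploit this (the class and accessor names are hypothetical, not from the actual design):

```smalltalk
"Hypothetical method on Backpropagation; #outputActivation and
 #target are assumed accessors returning vectors. The sigmoid
 derivative a * (1 - a) cancels against the cross-entropy
 derivative (a - y) / (a * (1 - a)), so no explicit derivative
 computation is needed for the output layer."
outputDelta
    ^ self outputActivation - self target
```

This is why many libraries special-case the sigmoid/cross-entropy (and softmax/log-likelihood) pairings rather than composing arbitrary derivatives.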
