#+begin_src clojure
(let [n {:phi identity
         :accum (comp (partial reduce +) (partial map *))
         :weights [2 2 2]}]
  [(repeat 3 n) (repeat 5 n) (assoc n :weights (vec (repeat 5 1)))])
#+end_src

would result in the following connection pattern: [[file:/tmp/layers.png]]
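For context, each neuron map above pairs an activation function (=:phi=) with an accumulator (=:accum=) and a weight vector. A minimal sketch of how such a map might be evaluated, assuming the accumulator is applied to the weights and inputs and the activation to its result (=run-neuron= is a hypothetical helper for illustration, not the library's API):

#+begin_src clojure
;; Hypothetical helper: evaluate one neuron map on a seq of inputs.
;; Here :accum takes (weights, inputs) and :phi post-processes the sum.
(defn run-neuron [{:keys [phi accum weights]} inputs]
  (phi (accum weights inputs)))

(run-neuron {:phi identity
             :accum (comp (partial reduce +) (partial map *))
             :weights [2 2 2]}
            [1 1 1])
;; => 6  (the weighted sum 2*1 + 2*1 + 2*1)
#+end_src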
> However, for other NNs you may care about the topological organisation
> of the neurons in a 1-D, 2-D, or 3-D space in order to do things like
> connecting corresponding neurons in different layers or having the
> probability of a connection be a function of the separation of the
> neurons. In this case, you might use a data structure representing
> the coordinates of each neuron as its key.

Fully agreed. I'm partway through implementing what you've just described (at least as I understand it): the library now declares a new "Graph" data type, which consists of a map of keys->Neural mappings as well as a directed edge set. Using this new data type it is possible to construct, run and train arbitrarily connected graphs of Neural elements. See the fourth example at http://repo.or.cz/w/neural-net.git

Best,

-- Eric

> Ross
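As a rough sketch of the Graph shape described above, combined with the coordinate-keyed idea from the quoted message. The field names =:nodes= and =:edges= and the helper functions are my assumptions for illustration; see the repository for the actual definition:

#+begin_src clojure
;; Hypothetical sketch: a graph of Neural elements as a map of
;; keys->neuron maps plus a directed edge set.  Names are assumptions,
;; not the library's actual API.
(def neuron {:phi identity
             :accum (comp (partial reduce +) (partial map *))
             :weights [1 1]})

(def graph
  {:nodes {:a neuron :b neuron :out neuron}
   :edges #{[:a :out] [:b :out]}})

;; Predecessors of a node, recovered from the edge set:
(defn preds [g k]
  (set (for [[from to] (:edges g) :when (= to k)] from)))

;; Keys need not be keywords: coordinate vectors work too, which makes
;; distance-dependent connection probabilities easy to express.
(defn dist [a b]
  (Math/sqrt (reduce + (map #(Math/pow (- %1 %2) 2) a b))))
#+end_src

With coordinate keys, a wiring rule could then connect two neurons with probability given by some function of =(dist k1 k2)=.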
-- You received this message because you are subscribed to the Google Groups "Clojure" group.
