It might be interesting to add a Clojure frontend to Nengo. That said, Nengo is very "opinionated" in the sense that you have to buy into the kind of neural simulation Eliasmith is doing (and the way he's doing it), which may not interest Eric. There are many other kinds of neural nets worth researching besides Eliasmith's NEF. Still, it'd be cool to have a Clojure frontend to Nengo.
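For a concrete sense of what such a frontend might look like, here is an untested interop sketch. The class and method names (NetworkImpl, NEFEnsembleFactoryImpl, .make, .addNode) are my recollection of Nengo 1.4's Java API and should be checked against the Nengo javadocs before relying on any of this.

#+begin_src clojure
  ;; Untested sketch: building a Nengo network from Clojure via Java interop.
  ;; Class/method names are assumptions based on Nengo 1.4's Java API.
  (import '(ca.nengo.model.impl NetworkImpl)
          '(ca.nengo.model.nef.impl NEFEnsembleFactoryImpl))

  (defn nengo-network
    "Build a Nengo network with one NEF ensemble per [name neurons dims]
    triple in `specs`."
    [specs]
    (let [net (doto (NetworkImpl.) (.setName "clojure-frontend"))
          ef  (NEFEnsembleFactoryImpl.)]
      (doseq [[nm n dims] specs]
        (.addNode net (.make ef nm n dims)))
      net))

  ;; e.g. (nengo-network [["A" 100 1] ["B" 100 2]])
#+end_src

Wiring the ensembles together would go through Nengo's projection/termination machinery, which I haven't sketched here.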
Carson

On Nov 13, 4:09 pm, Ross Gayler <r.gay...@gmail.com> wrote:
> You might also consider using your DSL as a frontend to the Nengo
> neural simulator (http://nengo.ca). Nengo (which is written in Java) has
> recently added a Python scripting interface
> (http://www.frontiersin.org/neuroinformatics/10.3389/neuro.11/007.2009/abstract).
> Nengo has a lot to recommend it and is pretty mature, so you may save
> yourself a lot of effort under the covers - also the way Nengo
> conceptualises the networks might be useful feedback to your DSL design.
>
> Ross
>
> On Nov 14, 5:18 am, "Eric Schulte" <schulte.e...@gmail.com> wrote:
> > Hi Ross,
> >
> > Ross Gayler <r.gay...@gmail.com> writes:
> >
> > > On Nov 13, 9:12 am, "Eric Schulte" <schulte.e...@gmail.com> wrote:
> > >> Albert Cardona <sapri...@gmail.com> writes:
> > >> > Your neural network DSL looks great. One minor comment: why use
> > >> > lists instead of sets? ...
> > >>
> > >> I used lists because I want to be able to specify a network in which
> > >> (at least initially) all neurons in a hidden layer are identical,
> > >> e.g. the list example at http://cs.unm.edu/~eschulte/src/neural-net/.
> > >
> > > You might want to consider maps.
> >
> > Currently I'm using maps to specify a single neuron, and I fear it
> > would add complexity to have two different meanings for maps.
> >
> > > For some NN models all you care about is that each neuron has a
> > > unique identity (in which case using an index value as a key is as
> > > good a solution as any).
> >
> > I'm currently using lists only for fully connected layers in a neural
> > network, e.g. the following code
> >
> > #+begin_src clojure
> >   (let [n {:phi identity
> >            :accum (comp (partial reduce +) (partial map *))
> >            :weights [2 2 2]}]
> >     [(repeat 3 n) (repeat 5 n) (assoc n :weights (vec (repeat 5 1)))])
> > #+end_src
> >
> > would result in the following connection pattern
> >
> > [[file:/tmp/layers.png]]
> > (attachment: layers.png)
> >
> > > However, for other NNs you may care about the topological
> > > organisation of the neurons in a 1-D, 2-D, or 3-D space in order to
> > > do things like connecting corresponding neurons in different layers
> > > or having the probability of a connection be a function of the
> > > separation of the neurons. In this case, you might use a data
> > > structure representing the coordinates of each neuron as its key.
> >
> > Fully agreed; I'm partway through implementing what you've just
> > described (at least as I understand it), in that the library now
> > declares a new "Graph" data type which consists of a list of
> > keys->Neural mappings as well as a directed edge set. Using this new
> > data type it is possible to construct, run and train arbitrarily
> > connected graphs of Neural elements. See the fourth example at
> > http://repo.or.cz/w/neural-net.git
> >
> > Best -- Eric
> >
> > > Ross
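To make the coordinate-keyed representation discussed in that exchange concrete, here is a small sketch of a graph as a map from [layer x y] keys to neuron specs plus a directed edge set. This is not the neural-net library's actual API; grid-layer and corresponding-edges are invented names for illustration.

#+begin_src clojure
  ;; Sketch only: coordinate-keyed neurons plus an explicit edge set.
  (defn grid-layer
    "Map every [layer x y] coordinate of a w-by-h grid to the same
    neuron spec."
    [layer w h neuron]
    (into {} (for [x (range w), y (range h)]
               [[layer x y] neuron])))

  (defn corresponding-edges
    "Directed edges connecting each neuron in layer `from` to the neuron
    at the same [x y] position in layer `to`."
    [from to w h]
    (set (for [x (range w), y (range h)]
           [[from x y] [to x y]])))

  ;; Two 3x3 layers of identical neurons, wired coordinate-to-coordinate.
  (let [n {:phi identity
           :accum (comp (partial reduce +) (partial map *))
           :weights [1]}]
    {:neurons (merge (grid-layer 0 3 3 n) (grid-layer 1 3 3 n))
     :edges   (corresponding-edges 0 1 3 3)})
#+end_src

A separation-dependent connection probability could then be expressed as a filter over candidate edges, since both endpoints carry their coordinates in the key.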