On Apr 3, 5:42 pm, "TG" <[EMAIL PROTECTED]> wrote:
> Hi there.
>
> Reading the page on python performance (http://scipy.org/PerformancePython)
> made me realize that I can achieve tremendous code acceleration with
> numpy just by using "u[:,:]" kind of syntax the clever way.
>
> Here is a little problem (Oja's rule of synaptic plasticity):
>
> * W is a matrix containing the weights of connections between elements
>   i and j
> * V is an array containing the values of elements
>
> I want to make W evolve with this rule:
>
> dW[i,j] / dt = alpha * (V[i] * V[j] - W[i,j] * V[i]^2)
>
> (don't pay attention to the derivative and stuff)
>
> So, how would you write it in this nifty clever way?
>
> As a beginning I wrote this:
>
> W += V.flatten().reshape((V.size, 1)) * V.flatten().reshape((1, V.size))
>
> But it is not complete and, I guess, not efficient.
Factor out V[i]:

alpha * (V[i] * V[j] - W[i,j] * V[i]^2) = alpha * V[i] * (V[j] - W[i,j] * V[i])

Assuming that V is a column vector, you could do it like this:

from numpy import array, transpose

alpha = 0.1   # learning rate; any small constant
V = array([[5.0], [3.0], [7.0]])   # column vector, shape (3, 1)
W = array([[1.0, 5.0, 3.0], [2.0, 2.0, 7.0], [3.0, 9.0, 8.0]])
W += alpha * V * (transpose(V) - W * V)

Broadcasting does the indexing for you: W * V scales row i of W by V[i],
transpose(V) is a row vector, so the subtraction gives V[j] - W[i,j] * V[i]
at each (i, j), and the final multiplication by the column V supplies the
leading V[i] factor.
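If you want to convince yourself that the broadcasting matches the formula,
here is a minimal check against an explicit double loop, using the original
(unfactored) right-hand side; alpha and the array values are the same
arbitrary examples as above:

import numpy as np

alpha = 0.1
V = np.array([[5.0], [3.0], [7.0]])
W = np.array([[1.0, 5.0, 3.0], [2.0, 2.0, 7.0], [3.0, 9.0, 8.0]])

# Vectorized update (factored form)
W_vec = W + alpha * V * (V.T - W * V)

# Element-by-element update, straight from
# dW[i,j] = alpha * (V[i] * V[j] - W[i,j] * V[i]^2)
W_loop = W.copy()
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        W_loop[i, j] += alpha * (V[i, 0] * V[j, 0] - W[i, j] * V[i, 0] ** 2)

assert np.allclose(W_vec, W_loop)

Since the loop uses the original right-hand side, this also checks the
algebraic factoring above.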