Hi there. Reading the page on Python performance ( http://scipy.org/PerformancePython ) made me realize that I can achieve tremendous speedups with numpy just by using slicing syntax like "u[:,:]" in a clever way.
Here is a little problem (Oja's rule of synaptic plasticity):

* W is a matrix containing the weights of the connections between elements i and j
* V is an array containing the values of the elements

I want to make W evolve with this rule:

dW[i,j] / dt = alpha * (V[i] * V[j] - W[i,j] * V[i]^2)

(don't pay attention to the derivative and such). So, how would you write it in this nifty, clever way? As a beginning I wrote this:

W += V.flatten().reshape((V.size, 1)) * V.flatten().reshape((1, V.size))

But it is incomplete and, I guess, inefficient.

-- 
http://mail.python.org/mailman/listinfo/python-list
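A minimal sketch of one way to vectorize the full update, assuming a simple Euler step with the time step folded into `alpha` (the function name `oja_step` and the default `alpha` value are my own choices, not from the original post). `np.outer(V, V)` produces the V[i] * V[j] term, and `V[:, None] ** 2` broadcasts V[i]^2 down each row of W:

```python
import numpy as np

def oja_step(W, V, alpha=0.01):
    """One Euler step of Oja's rule, fully vectorized.

    Implements W[i,j] += alpha * (V[i]*V[j] - W[i,j]*V[i]**2):
    np.outer(V, V) is the matrix of products V[i]*V[j], and
    V[:, None]**2 is a column vector that broadcasts V[i]**2
    across row i of W.
    """
    W += alpha * (np.outer(V, V) - W * V[:, None] ** 2)
    return W

# Example usage on small random data:
rng = np.random.default_rng(0)
W = rng.random((4, 4))
V = rng.random(4)
oja_step(W, V, alpha=0.01)
```

Since everything broadcasts, no explicit reshaping of V is needed; `V[:, None]` (a column view) and plain `V` (treated as a row) cover both orientations.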