On 28/12/2012 16:08, Dimitri Pourbaix wrote:
> Luc,
>
>> However, it is also possible to set a non-diagonal weight matrix, and
>> one class (AbstractLeastSquaresOptimizer) performs an eigen decomposition
>> on the matrix to extract its square root. I don't know any use case for
>> this, but it has been set up this way, so I guess someone has a use for
>> non-diagonal weights.
>
> Such a situation occurs when observations are correlated. That is
> actually the most general expression for a least-squares problem.
>
>> I wonder if I should simply add this as is or if we should rather remove
>> the non-diagonal weights feature and support only vector weights.
>
> Even if a vector of weights is convenient, it would only cover a subset
> of situations. However, even a vector of weights is not needed if both
> the models and the observations are pre-multiplied by the square root
> of their weight. By the way, I remind you that those weights already
> caused some bugs in the 2.0 release.
>
> Personally, I could live with a vector form.
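To make the pre-multiplication concrete: if W is the (symmetric, positive-definite,
possibly non-diagonal) weight matrix, any factor R with R^T R = W can be applied to
both the residuals and the Jacobian, after which a plain unweighted least-squares
solver can be used. The sketch below uses a Cholesky factor for simplicity, whereas
the class discussed above uses an eigen decomposition; the class and method names
(WhitenedProblem, whitenResiduals, whitenJacobian) are purely illustrative and not
part of any released API.

    import org.apache.commons.math3.linear.Array2DRowRealMatrix;
    import org.apache.commons.math3.linear.ArrayRealVector;
    import org.apache.commons.math3.linear.CholeskyDecomposition;
    import org.apache.commons.math3.linear.RealMatrix;
    import org.apache.commons.math3.linear.RealVector;

    /** Illustrative sketch: apply a square root of the weight matrix W
     *  (here a Cholesky factor, W = L L^T) to residuals and Jacobian so
     *  an unweighted least-squares solver can be used afterwards. */
    public final class WhitenedProblem {

        private WhitenedProblem() {}

        /** Whitened residuals L^T r, chosen so that ||L^T r||^2 = r^T W r. */
        public static RealVector whitenResiduals(RealMatrix weight, double[] residuals) {
            RealMatrix lt = new CholeskyDecomposition(weight).getLT();
            return lt.operate(new ArrayRealVector(residuals, false));
        }

        /** Whitened Jacobian L^T J, same factor applied to the model derivatives. */
        public static RealMatrix whitenJacobian(RealMatrix weight, double[][] jacobian) {
            RealMatrix lt = new CholeskyDecomposition(weight).getLT();
            return lt.multiply(new Array2DRowRealMatrix(jacobian, false));
        }
    }

For a diagonal W this collapses to scaling residual i and Jacobian row i by
sqrt(w_i), which is exactly what a vector-of-weights convenience would provide.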
So in order to make sure I understand your point, you would be OK if I
deprecate the non-diagonal weights, in which case users needing this would
have to implement it themselves by pre-multiplication (as both you and
Konstantin seem to propose)?

> As a more general comment, I find it amazing that all the +1 for the
> release were only concerned with compliance with (commons) rules,
> configuration files, ... Just 4 days after the release, you suddenly
> figure out that a user is in trouble and you want a quick fix. Maybe
> such a test would have been needed BEFORE the release!

Sure, but for the record the feature was also a last-minute change. This
was discussed on the list, and the final decision was to add the feature
even though the release was close. No wonder we failed to test it
thoroughly. We don't expect our releases to be perfect. We do our best,
with the resources we have.

> Regards,
> Dim.

On 28/12/2012 16:13, Konstantin Berlin wrote:
> Hi,
>
> I am not clear on what the use of weights in a non-least-squares
> problem is. In a least-squares problem the weights can simply be taken
> in by the user in their input function. So the weights option is a
> convenience feature; any non-standard matrix weights could be
> programmed in by the user, if they need to. I have a hard time
> imagining a case when the weights are non-diagonal. I think your idea
> is good and you should go on with it. The quadratic memory scaling
> with the number of data points is not acceptable.

Thanks Konstantin. So if Dimitri confirms I understood his point
correctly, I will proceed with deprecating the Weight class and
introducing a WeightVector class, understanding it is a convenience
feature suited for simple non-correlated observations.

> -Konstantin

best regards,
Luc
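For completeness, here is what folding a vector of weights into the user's own
code looks like, per Konstantin's remark that weights "can simply be taken in by
the user in their input function". This is a minimal sketch only, independent of
whatever the proposed WeightVector class ends up looking like: each residual and
each Jacobian row is scaled by sqrt(w_i), so nothing quadratic in the number of
observations is ever allocated.

    /** Sketch only: fold per-observation weights into the residuals and
     *  Jacobian the user already computes, instead of passing an n-by-n
     *  weight matrix to the optimizer. Memory stays linear in n. */
    public final class VectorWeighting {

        private VectorWeighting() {}

        /** Scale residual i by sqrt(w_i); equivalent to a diagonal weight matrix. */
        public static double[] weightResiduals(double[] residuals, double[] weights) {
            double[] out = new double[residuals.length];
            for (int i = 0; i < residuals.length; i++) {
                out[i] = Math.sqrt(weights[i]) * residuals[i];
            }
            return out;
        }

        /** Scale row i of the Jacobian by the same sqrt(w_i) factor. */
        public static double[][] weightJacobian(double[][] jacobian, double[] weights) {
            double[][] out = new double[jacobian.length][];
            for (int i = 0; i < jacobian.length; i++) {
                double s = Math.sqrt(weights[i]);
                out[i] = new double[jacobian[i].length];
                for (int j = 0; j < jacobian[i].length; j++) {
                    out[i][j] = s * jacobian[i][j];
                }
            }
            return out;
        }
    }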