On 9/7/12 2:22 PM, Ted Dunning wrote:
> This is great.
>
> A very useful feature would be to allow basic L_1 and L_2 regularization.
>
> This makes it much easier to avoid convergence problems with separable data.
>
> It might be interesting to think for a moment about how easy it would be to
> support generalized linear regression in this same package.  Small changes
> to the loss function in the optimization should allow not just logistic and
> probit regression, but also Poisson regression and SVM in the same
> framework.

+1
Patches welcome!

Phil
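
To illustrate the loss-function point above, here is a rough sketch of how a
pluggable per-observation loss could let one Newton-style optimizer cover
several of these models.  All names below are hypothetical and are not
existing Commons Math API:

// Hypothetical sketch only: one optimizer, swappable per-observation losses.
public interface GlmLoss {
    /** Negative log-likelihood contribution of one observation. */
    double value(double eta, double y);
    /** First derivative with respect to the linear predictor eta. */
    double gradient(double eta, double y);
    /** Second derivative with respect to eta (used by the Newton step). */
    double hessian(double eta, double y);
}

// Logistic regression: Bernoulli likelihood with the logit link.
public final class LogisticLoss implements GlmLoss {
    public double value(double eta, double y) {
        double p = 1.0 / (1.0 + Math.exp(-eta));
        return -(y * Math.log(p) + (1.0 - y) * Math.log(1.0 - p));
    }
    public double gradient(double eta, double y) {
        return 1.0 / (1.0 + Math.exp(-eta)) - y;
    }
    public double hessian(double eta, double y) {
        double p = 1.0 / (1.0 + Math.exp(-eta));
        return p * (1.0 - p);
    }
}

// Poisson regression: log link; the constant log(y!) term is dropped.
public final class PoissonLoss implements GlmLoss {
    public double value(double eta, double y)    { return Math.exp(eta) - y * eta; }
    public double gradient(double eta, double y) { return Math.exp(eta) - y; }
    public double hessian(double eta, double y)  { return Math.exp(eta); }
}

An L_1 or L_2 penalty would then amount to adding lambda * |beta_j| or
lambda * beta_j^2 terms (and their derivatives) at the optimizer level.
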
>
> On Fri, Sep 7, 2012 at 3:22 AM, marios michaelidis
> <mimari...@hotmail.com> wrote:
>
>> I am willing to provide complete
>> Logistic and Probit regression algorithms, fitted by Newton-Raphson
>> maximum-likelihood estimation, behind a very simple programmatic interface
>> (e.g. regression(double matrix[][], double Target[], String Constant,
>> double precision, double tolerance)), with academic references, and
>> very fast (about 3 seconds for a 60k data set), with getter methods for
>> all the common statistics such as null deviance, deviance, AIC, BIC,
>> Chi-square of the model, betas, Wald statistics and p-values, Cox-Snell
>> R-square, Nagelkerke's R-square, pseudo R-square, residuals,
>> probabilities, and the classification matrix.
>>
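
For what it's worth, a bare-bones sketch of the Newton-Raphson
maximum-likelihood fit described above might look roughly like this.  It is
illustrative only, not the contributed code, and it reuses the existing
linear algebra in org.apache.commons.math3.linear for the Newton step:

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.ArrayRealVector;
import org.apache.commons.math3.linear.LUDecomposition;
import org.apache.commons.math3.linear.RealVector;

public final class NewtonLogisticSketch {

    /** Fits logistic-regression coefficients (intercept first) by Newton-Raphson. */
    public static double[] fit(double[][] x, double[] y, int maxIter, double tol) {
        int n = x.length;
        int p = x[0].length + 1;                // +1 for the intercept column
        double[] beta = new double[p];
        for (int iter = 0; iter < maxIter; iter++) {
            double[] grad = new double[p];      // X' (y - mu)
            double[][] info = new double[p][p]; // X' W X with W = diag(mu (1 - mu))
            for (int i = 0; i < n; i++) {
                double[] xi = new double[p];
                xi[0] = 1.0;
                System.arraycopy(x[i], 0, xi, 1, p - 1);
                double eta = 0.0;
                for (int j = 0; j < p; j++) eta += beta[j] * xi[j];
                double mu = 1.0 / (1.0 + Math.exp(-eta));
                double w = mu * (1.0 - mu);
                for (int a = 0; a < p; a++) {
                    grad[a] += (y[i] - mu) * xi[a];
                    for (int b = 0; b < p; b++) info[a][b] += w * xi[a] * xi[b];
                }
            }
            // Newton step: beta <- beta + (X' W X)^-1 X' (y - mu)
            RealVector step = new LUDecomposition(new Array2DRowRealMatrix(info))
                    .getSolver().solve(new ArrayRealVector(grad));
            double maxChange = 0.0;
            for (int j = 0; j < p; j++) {
                beta[j] += step.getEntry(j);
                maxChange = Math.max(maxChange, Math.abs(step.getEntry(j)));
            }
            if (maxChange < tol) break;         // stop once the step is negligible
        }
        return beta;
    }
}

The deviance, AIC, BIC and the other statistics listed above could then be
computed from the fitted beta and the per-observation log-likelihoods.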


