The following code will allow you to run Logistic Regression using L-BFGS:

val lbfgs = new LBFGS(new LogisticGradient(), new SquaredL2Updater())
lbfgs.setMaxNumIterations(numIterations)
  .setRegParam(regParam)
  .setConvergenceTol(tol)
  .setNumCorrections(numCor)

val weights = lbfgs.optimize(data, initialWeights)

The loss function support you are asking about is the `new LogisticGradient()`
argument; the regularization support is the `new SquaredL2Updater()` argument.

The supported loss functions are:
1) Logistic - LogisticGradient
2) LeastSquares - LeastSquaresGradient
3) Hinge - HingeGradient

The regularizers are:
0) No regularization - SimpleUpdater
1) L1 regularization - L1Updater
2) L2 regularization - SquaredL2Updater
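
For example, to train with the hinge loss and L1 regularization instead, you swap in the corresponding gradient/updater pair. Here is a sketch along those lines; `data` (an RDD of (label, features) pairs), `initialWeights`, and the hyperparameter values are placeholders, as in the snippet above:

```scala
import org.apache.spark.mllib.optimization.{LBFGS, HingeGradient, L1Updater}

// Sketch: the gradient picks the loss, the updater picks the regularizer.
// Assumes `data: RDD[(Double, Vector)]` and `initialWeights: Vector` exist.
val svmL1 = new LBFGS(new HingeGradient(), new L1Updater())
svmL1.setMaxNumIterations(100)   // placeholder values
  .setRegParam(0.1)
  .setConvergenceTol(1e-4)
  .setNumCorrections(10)

val weights = svmL1.optimize(data, initialWeights)
```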

You can find more here: 
http://spark.apache.org/docs/latest/mllib-linear-methods.html#loss-functions

I would suggest using L-BFGS rather than SGD as it's both much faster and more 
accurate.

Burak

----- Original Message -----
From: "SK" <skrishna...@gmail.com>
To: u...@spark.incubator.apache.org
Sent: Thursday, August 7, 2014 6:31:14 PM
Subject: [MLLib]:choosing the Loss function

Hi,

According to the MLlib guide, there seems to be support for different loss
functions, but I could only find the regType parameter for choosing the
regularization, not a command-line parameter for choosing the loss function.
Does MLlib support a parameter to choose the loss function?

thanks



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/MLLib-choosing-the-Loss-function-tp11738.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
