Hi SK,

I'm working on a PR that adds a logistic regression interface with LBFGS.
I hope it will be merged into the Spark 1.1 release.
https://github.com/apache/spark/pull/1862

Until it's merged, you can just copy the code into your application and use it.
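
For reference, here's a rough end-to-end sketch of how the pieces fit
together (an illustration based on the current MLlib optimization package,
not the PR's final interface; it assumes the bias term is appended as the
last feature via MLUtils.appendBias, and the hyperparameter values are
placeholders):

import org.apache.spark.mllib.classification.LogisticRegressionModel
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.optimization.{LBFGS, LogisticGradient, SquaredL2Updater}
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
import org.apache.spark.rdd.RDD

def trainWithLBFGS(
    training: RDD[LabeledPoint],
    numFeatures: Int): LogisticRegressionModel = {
  // Append a 1.0 bias term so the intercept is learned as the last weight.
  val data = training.map(p => (p.label, MLUtils.appendBias(p.features)))
  val initialWeights = Vectors.dense(new Array[Double](numFeatures + 1))

  val (weightsWithIntercept, lossHistory) = LBFGS.runLBFGS(
    data,
    new LogisticGradient(),
    new SquaredL2Updater(),
    10,    // numCorrections
    1e-4,  // convergenceTol
    100,   // maxNumIterations
    0.1,   // regParam
    initialWeights)
  println(s"Final loss: ${lossHistory.last}")

  // appendBias puts the bias last, so the intercept is the final element.
  val n = weightsWithIntercept.size
  val intercept = weightsWithIntercept(n - 1)
  val weights = Vectors.dense(weightsWithIntercept.toArray.slice(0, n - 1))
  new LogisticRegressionModel(weights, intercept)
}
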
I'm also working on another PR that automatically rescales the training set
to improve the condition number of the optimization problem. After
training, the scaling factors are folded back into the weights, so the
whole thing is transparent to users. LIBSVM and glmnet do this to deal with
datasets that have huge variance in some columns.
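
Conceptually, the trick looks like this (a sketch of the idea only, not the
PR's code; stddev holds each column's standard deviation):

// Train on features divided by their column's standard deviation (better
// conditioned), then divide each learned weight by the same factor so the
// returned weights apply to the original, unscaled features.
def unscaleWeights(scaledWeights: Array[Double], stddev: Array[Double]): Array[Double] =
  scaledWeights.zip(stddev).map { case (w, s) => if (s != 0.0) w / s else w }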


Sincerely,

DB Tsai
-------------------------------------------------------
My Blog: https://www.dbtsai.com
LinkedIn: https://www.linkedin.com/in/dbtsai


On Mon, Aug 11, 2014 at 2:21 PM, Burak Yavuz <bya...@stanford.edu> wrote:

> Hi,
>
> // Initialize the optimizer, using logistic regression as the loss
> // function with L2 regularization
> val lbfgs = new LBFGS(new LogisticGradient(), new SquaredL2Updater())
>
> // Set the hyperparameters
> lbfgs.setMaxNumIterations(numIterations)
>   .setRegParam(regParam)
>   .setConvergenceTol(tol)
>   .setNumCorrections(numCor)
>
> // Retrieve the weights
> val weightsWithIntercept = lbfgs.optimize(data, initialWeights)
>
> // Slice weightsWithIntercept into the weights and the intercept;
> // the intercept is the last element when the bias term is appended last.
> val weights = Vectors.dense(
>   weightsWithIntercept.toArray.slice(0, weightsWithIntercept.size - 1))
> val intercept = weightsWithIntercept(weightsWithIntercept.size - 1)
>
> // Initialize the logistic regression model
> val model = new LogisticRegressionModel(weights, intercept)
>
> model.predict(test) // Make your predictions
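>
> To sanity-check the result (an illustrative snippet, assuming `test` is an
> RDD[LabeledPoint]):
>
> val accuracy = test.map { p =>
>   if (model.predict(p.features) == p.label) 1.0 else 0.0
> }.mean()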
>
> The example code doesn't generate the LogisticRegressionModel that you
> can make predictions with.
>
> `LBFGS.runMiniBatchLBFGS` outputs a tuple of (weights, lossHistory). The
> example code was written for a benchmark, so it was more interested in
> the loss history than in the model itself.
>
> You can also run
> `val (weightsWithIntercept, localLoss) = LBFGS.runMiniBatchLBFGS ...`,
> then slice `weightsWithIntercept` into the intercept and the rest of the
> weights, and instantiate the model again as:
> val model = new LogisticRegressionModel(weights, intercept)
>
>
> Burak
>
>
>
> ----- Original Message -----
> From: "SK" <skrishna...@gmail.com>
> To: u...@spark.incubator.apache.org
> Sent: Monday, August 11, 2014 11:52:04 AM
> Subject: Re: [MLLib]:choosing the Loss function
>
> Hi,
>
> Thanks for the reference to the LBFGS optimizer.
> I tried to use the LBFGS optimizer, but I am not able to pass it as an
> input to the LogisticRegression model for binary classification. After
> studying the code in mllib/classification/LogisticRegression.scala, it
> appears that the only implementation of LogisticRegression uses
> GradientDescent as a fixed optimizer. In other words, I don't see a
> setOptimizer() function that I can use to change the optimizer to LBFGS.
>
> I tried to follow the code in
>
> https://github.com/dbtsai/spark-lbfgs-benchmark/blob/master/src/main/scala/org/apache/spark/mllib/benchmark/BinaryLogisticRegression.scala
> that makes use of LBFGS, but it is not clear to me where a
> LogisticRegression model trained with LBFGS is returned that I can use
> to classify the test dataset.
>
> If someone has sample code that uses LogisticRegression with LBFGS instead
> of GradientDescent as the optimization algorithm, it would be helpful if
> you could post it.
>
> Thanks
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/MLLib-choosing-the-Loss-function-tp11738p11913.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
