Re: MLlib - logistic regression with GD vs LBFGS, sparse vs dense benchmark result

2014-04-27 Thread DB Tsai
Hi David, I'm recording the loss history in the DiffFunction implementation, and that's why the rejected steps are also recorded in my loss history. Is there any API in Breeze's LBFGS to get a history that already excludes the rejected steps? Or should I just call the "iterations" method and check "itera
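To make the over-counting concrete, here is a hypothetical, self-contained sketch (plain Scala, not Breeze or DB Tsai's actual code): a calculate() that appends every evaluation to lossHistory, driven by a toy backtracking line search. Because the line search evaluates rejected candidate steps, lossHistory grows faster than the number of accepted iterations, which is exactly the mismatch described above.

```scala
import scala.collection.mutable.ArrayBuffer

// Toy illustration of recording losses inside the objective function.
// All names here are invented for the example.
object RejectedStepsDemo {
  val lossHistory = ArrayBuffer[Double]()

  // f(x) = (x - 3)^2 with gradient 2(x - 3); every evaluation is logged,
  // including the rejected line-search trials.
  def calculate(x: Double): (Double, Double) = {
    val loss = (x - 3) * (x - 3)
    lossHistory += loss
    (loss, 2 * (x - 3))
  }

  def main(args: Array[String]): Unit = {
    var x = 0.0
    var acceptedIters = 0
    var converged = false
    while (!converged && acceptedIters < 10) {
      val (f0, g) = calculate(x)
      if (math.abs(g) < 1e-9) converged = true
      else {
        // Backtracking line search: halve the step until the loss decreases.
        // Each failed trial still calls calculate() and pollutes lossHistory.
        var step = 1.0
        while (calculate(x - step * g)._1 >= f0) step /= 2
        x -= step * g
        acceptedIters += 1
      }
    }
    println(s"accepted iterations: $acceptedIters")
    println(s"loss evaluations recorded: ${lossHistory.size}")
  }
}
```

Running this records more loss values than there are accepted iterations, so a loss history built this way cannot be read as one entry per optimizer step.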

Re: MLlib - logistic regression with GD vs LBFGS, sparse vs dense benchmark result

2014-04-27 Thread DB Tsai
Also, how many rejection failures will terminate the optimization process? How is this related to "numberOfImprovementFailures"? Thanks. Sincerely, DB Tsai --- My Blog: https://www.dbtsai.com LinkedIn: https://www.linkedin.com/in/dbtsai On S

Re: MLlib - logistic regression with GD vs LBFGS, sparse vs dense benchmark result

2014-04-27 Thread DB Tsai
I think I figured it out. Instead of calling minimize and recording the loss in the DiffFunction, I should do the following:

    val states = lbfgs.iterations(new CachedDiffFunction(costFun), initialWeights.toBreeze.toDenseVector)
    states.foreach(state => lossHistory.append(state.value))

All the losses i
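The pattern above works because the iterator yields one State per accepted step, so appending state.value gives exactly one loss per real iteration. Below is a minimal self-contained sketch of that states-iterator pattern in plain Scala (a toy fixed-step minimizer, not Breeze itself; all names are invented for illustration):

```scala
// Sketch of the states-iterator pattern: a minimizer exposing
// iterations(...) as an Iterator[State], where each State is one
// accepted step, so collecting state.value yields a clean loss history.
object StatesIteratorDemo {
  final case class State(iter: Int, x: Double, value: Double)

  // Toy gradient descent on f(x) = (x - 3)^2 with a fixed step size;
  // every emitted state is an accepted iteration by construction.
  def iterations(init: Double, maxIter: Int): Iterator[State] =
    Iterator.iterate(State(0, init, (init - 3) * (init - 3))) { s =>
      val g  = 2 * (s.x - 3)
      val x2 = s.x - 0.25 * g
      State(s.iter + 1, x2, (x2 - 3) * (x2 - 3))
    }.take(maxIter + 1)

  def main(args: Array[String]): Unit = {
    val lossHistory = scala.collection.mutable.ArrayBuffer[Double]()
    // Same shape as the snippet above: one appended loss per state.
    iterations(init = 0.0, maxIter = 10).foreach(s => lossHistory.append(s.value))
    println(s"iterations recorded: ${lossHistory.size}")
    println(s"final loss: ${lossHistory.last}")
  }
}
```

The design point is that the driver loop, not the objective function, owns the history: the DiffFunction stays a pure (loss, gradient) computation, and bookkeeping lives where accepted steps are visible.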