Github user thvasilo commented on a diff in the pull request:

    https://github.com/apache/flink/pull/692#discussion_r30785863
  
    --- Diff: flink-staging/flink-ml/src/main/scala/org/apache/flink/ml/optimization/Solver.scala ---
    @@ -40,6 +44,32 @@ abstract class Solver extends Serializable with WithParameters {
         data: DataSet[LabeledVector],
         initialWeights: Option[DataSet[WeightVector]]): DataSet[WeightVector]
     
    +  /** Creates initial weights vector, creating a DataSet with a WeightVector element
    +    *
    +    * @param initialWeights An Option that may contain an initial set of weights
    +    * @param data The data for which we optimize the weights
    +    * @return A DataSet containing a single WeightVector element
    +    */
    +  def createInitialWeightsDS(initialWeights: Option[DataSet[WeightVector]],
    +                             data: DataSet[LabeledVector]): DataSet[WeightVector] = {
    +    // TODO: Faster way to do this?
    +    val dimensionsDS = data.map(_.vector.size).reduce((a, b) => b)
    +
    +    initialWeights match {
    +      // Ensure provided weight vector is a DenseVector
    +      case Some(wvDS) =>
    +        wvDS.map { wv => {
    +          val denseWeights = wv.weights match {
    +            case dv: DenseVector => dv
    +            case sv: SparseVector => sv.toDenseVector
    +          }
    +          WeightVector(denseWeights, wv.intercept)
    --- End diff --
    
    Not sure what you mean here. Is it `WeightVector(denseWeights, wv.intercept)` that is off?
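    
    In case it helps to see it in isolation, here is a minimal, self-contained sketch of what that branch does, assuming the `WeightVector`, `DenseVector`, and `SparseVector` types from flink-ml (the helper name `toDense` is just for illustration). The match only normalizes the weights to a dense representation, and the line you flagged re-wraps them with the unchanged intercept:
    
    ```scala
    import org.apache.flink.ml.common.WeightVector
    import org.apache.flink.ml.math.{DenseVector, SparseVector}
    
    // Illustrative sketch (helper name is hypothetical), mirroring the Some(wvDS) branch above:
    // normalize the weights to a dense representation; the intercept is passed through unchanged.
    def toDense(wv: WeightVector): WeightVector = {
      val denseWeights = wv.weights match {
        case dv: DenseVector => dv
        case sv: SparseVector => sv.toDenseVector
      }
      WeightVector(denseWeights, wv.intercept)
    }
    ```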

