Hi,
While reviewing ClassificationModel custom implementations, I found
that of the 4 "production" models, two are final while the other two
are not. Is there any reason for this?
** `DecisionTreeClassificationModel` (`final`)
** `RandomForestClassificationModel` (`final`)
** `LogisticRegressionMo
Can you see if this is the patch that caused the issue?
https://github.com/apache/spark/pull/11737
On Wed, Mar 23, 2016 at 2:20 PM, Koert Kuipers wrote:
> one of our unit tests broke with changes in spark 2.0 snapshot in last few
> days (or maybe i simple missed it longer). i think it boils d
one of our unit tests broke with changes in the spark 2.0 snapshot in the last few
days (or maybe i simply missed it for longer). i think it boils down to this:
val df1 = sc.makeRDD(1 to 3).toDF
val df2 = df1.map(row => Row(row(0).asInstanceOf[Int] + 1))(RowEncoder(df1.schema))
println(s"schema before ${df1.schema}")
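For context, a minimal self-contained version of the snippet above might look like the sketch below. The `SparkSession` setup, the second `println`, and the object/`main` wrapper are assumptions added here for completeness; they were not in the original mail.

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.catalyst.encoders.RowEncoder

object SchemaRepro {
  def main(args: Array[String]): Unit = {
    // Assumed local Spark 2.x session; not part of the original report.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("schema-repro")
      .getOrCreate()
    import spark.implicits._
    val sc = spark.sparkContext

    val df1 = sc.makeRDD(1 to 3).toDF

    // Map each Row to a new Row, passing an explicit RowEncoder so the
    // result is declared to keep df1's schema instead of falling back
    // to a generic binary encoder.
    val df2 = df1.map(row => Row(row(0).asInstanceOf[Int] + 1))(RowEncoder(df1.schema))

    println(s"schema before ${df1.schema}")
    println(s"schema after ${df2.schema}")

    spark.stop()
  }
}
```

Comparing the two printed schemas is presumably how the test detected the behavior change in the 2.0 snapshot.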
Added to
https://cwiki.apache.org/confluence/display/SPARK/Supplemental+Spark+Projects
I'm not clear what our criteria are for being added as an org, vs
project, vs posting projects on spark-packages.org. Should this page
actually go away in favor of spark-packages.org?
The wiki doesn't do harm,