Here are the JIRA ticket and PR for supporting PolynomialExpansion with
degree 1; the PR has been merged:

https://issues.apache.org/jira/browse/SPARK-13338
https://github.com/apache/spark/pull/11216
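
For reference, a minimal sketch of what this enables: sweeping the expansion
degree (including 1, i.e. no expansion at all) inside a single CrossValidator
run. The column names, the LinearRegression estimator, and the fold count
below are illustrative assumptions, not taken from this thread.

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.evaluation.RegressionEvaluator
import org.apache.spark.ml.feature.PolynomialExpansion
import org.apache.spark.ml.regression.LinearRegression
import org.apache.spark.ml.tuning.{CrossValidator, ParamGridBuilder}

val poly = new PolynomialExpansion()
  .setInputCol("features")
  .setOutputCol("polyFeatures")

val lr = new LinearRegression()
  .setFeaturesCol("polyFeatures")
  .setLabelCol("label")

val pipeline = new Pipeline().setStages(Array(poly, lr))

// Degree 1 leaves the feature vector unchanged, so the grid compares
// "no expansion" against genuine expansions in one code path.
val paramGrid = new ParamGridBuilder()
  .addGrid(poly.degree, Array(1, 2, 3))
  .build()

val cv = new CrossValidator()
  .setEstimator(pipeline)
  .setEvaluator(new RegressionEvaluator().setLabelCol("label"))
  .setEstimatorParamMaps(paramGrid)
  .setNumFolds(3)

// val cvModel = cv.fit(trainingData)  // trainingData: a DataFrame with "features" and "label" columns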

2016-05-02 9:20 GMT-07:00 Nick Pentreath <nick.pentre...@gmail.com>:

> There is a JIRA and PR around for supporting polynomial expansion with
> degree 1. Offhand I can't recall if it's been merged.
> On Mon, 2 May 2016 at 17:45, Julio Antonio Soto de Vicente <ju...@esbet.es>
> wrote:
>
>> Hi,
>>
>> Same goes for PolynomialExpansion in org.apache.spark.ml.feature. It
>> would be nice to cross-validate a degree 1 polynomial expansion (that
>> is, no expansion at all) against other polynomial expansion degrees.
>> Unfortunately, degree is forced to be >= 2.
>>
>> --
>> Julio
>>
>> > On 2 May 2016, at 9:05, Rahul Tanwani <tanwanira...@gmail.com> wrote:
>> >
>> > Hi,
>> >
>> > In certain cases (mostly due to time constraints), we need to train a
>> > model without cross-validation. In such a case, since the k-fold value
>> > for the cross validator cannot be one, we have to maintain two different
>> > code paths to cover both scenarios (with and without cross-validation).
>> >
>> > Would it be an okay idea to generalize the cross validator so it can work
>> > with a k-fold value of 1? The only purpose for this is to avoid
>> > maintaining two different code paths; functionally it should behave as if
>> > cross-validation were not present.
>> >
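
On the CrossValidator question quoted above: a rough sketch of one way to
collapse the two code paths today. fitMaybeCV is a hypothetical helper, not
a Spark API; it falls back to a plain fit when cross-validation is not
wanted, since CrossValidator itself requires numFolds >= 2.

import org.apache.spark.ml.{Estimator, Model}
import org.apache.spark.ml.evaluation.Evaluator
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.ml.tuning.CrossValidator
import org.apache.spark.sql.DataFrame

def fitMaybeCV[M <: Model[M]](
    estimator: Estimator[M],
    evaluator: Evaluator,
    paramMaps: Array[ParamMap],
    numFolds: Int,
    data: DataFrame): Model[_] = {
  if (numFolds >= 2) {
    // Normal path: k-fold cross-validation over the parameter grid.
    new CrossValidator()
      .setEstimator(estimator)
      .setEvaluator(evaluator)
      .setEstimatorParamMaps(paramMaps)
      .setNumFolds(numFolds)
      .fit(data)
  } else {
    // "k = 1" path: skip cross-validation and fit on the full data with
    // the first parameter map (or defaults if the grid is empty). The
    // evaluator is only used when cross-validating.
    if (paramMaps.nonEmpty) estimator.fit(data, paramMaps.head)
    else estimator.fit(data)
  }
}

With something like this, calling code can pass numFolds = 1 to skip
cross-validation entirely, which is the behavior described above, without
branching at every call site.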
