What possible reason do they have to think it's fragmentation?

From: janardhan shetty [mailto:janardhan...@gmail.com]
Sent: Saturday, August 06, 2016 2:01 PM
To: Ted Yu
Cc: user
Subject: Re: Symbol HasInputCol is inaccessible from this place

Yes seems like, wondering if this can be made public in order to develop
custom transformers or any other alternatives ?
I searched *Suite.scala and found that only the following file contains
classes extending Transformer:

./mllib/src/test/scala/org/apache/spark/ml/PipelineSuite.scala

But HasInputCol is not used there.

FYI
On Sat, Aug 6, 2016 at 11:01 AM, janardhan shetty wrote:

> Yes seems like, wondering if this can be made public in order to develop
> custom transformers or any other alternatives ?
Yes seems like, wondering if this can be made public in order to develop
custom transformers or any other alternatives ?
On Sat, Aug 6, 2016 at 10:07 AM, Ted Yu wrote:
> Is it because HasInputCol is private ?
>
> private[ml] trait HasInputCol extends Params {
>
> On Thu, Aug 4, 2016 at 1:18 PM, janardhan shetty wrote:
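[Editor's note: one common alternative, since `HasInputCol`/`HasOutputCol` are `private[ml]`, is to declare equivalent params directly on the custom transformer instead of mixing in the private traits. The sketch below is self-contained plain Scala: `Param` and `ParamHolder` are simplified stand-ins written for this example, NOT Spark's real `org.apache.spark.ml.param` classes, and `CustomTransformer` here only illustrates the param-declaration pattern.]

```scala
import scala.collection.mutable

// Simplified stand-in for Spark's Param (name + doc attached to a parent).
class Param[T](val parent: AnyRef, val name: String, val doc: String)

// Simplified stand-in for the param get/set machinery in Spark's Params.
trait ParamHolder {
  private val paramMap = mutable.Map.empty[String, Any]
  def set[T](param: Param[T], value: T): this.type = {
    paramMap(param.name) = value
    this
  }
  def getParam[T](param: Param[T]): T =
    paramMap(param.name).asInstanceOf[T]
}

class CustomTransformer(val uid: String) extends ParamHolder {
  // Instead of `with HasInputCol with HasOutputCol` (inaccessible outside
  // the ml package), declare the same params yourself:
  val inputCol: Param[String] =
    new Param[String](this, "inputCol", "input column name")
  val outputCol: Param[String] =
    new Param[String](this, "outputCol", "output column name")

  def setInputCol(value: String): this.type = set(inputCol, value)
  def setOutputCol(value: String): this.type = set(outputCol, value)
}

object Demo extends App {
  val t = new CustomTransformer("custom_1")
    .setInputCol("text")
    .setOutputCol("features")
  println(t.getParam(t.inputCol))   // text
  println(t.getParam(t.outputCol))  // features
}
```

With real Spark, the same pattern applies: declare `new Param[String](this, "inputCol", ...)` on the transformer itself rather than relying on the package-private shared traits.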
Is it because HasInputCol is private?

private[ml] trait HasInputCol extends Params {

On Thu, Aug 4, 2016 at 1:18 PM, janardhan shetty wrote:
> Version : 2.0.0-preview
>
> import org.apache.spark.ml.param._
> import org.apache.spark.ml.param.shared.{HasInputCol, HasOutputCol}
>
>
> class CustomTransformer(override val uid: String) extends Transformer with
> HasInputCol with HasOutputCol with DefaultParamsWritable
Any thoughts or suggestions on this error?
On Thu, Aug 4, 2016 at 1:18 PM, janardhan shetty wrote:
> Version : 2.0.0-preview
>
> import org.apache.spark.ml.param._
> import org.apache.spark.ml.param.shared.{HasInputCol, HasOutputCol}
>
>
> class CustomTransformer(override val uid: String) extends Transformer with
> HasInputCol with HasOutputCol with DefaultParamsWritable
Version : 2.0.0-preview
import org.apache.spark.ml.param._
import org.apache.spark.ml.param.shared.{HasInputCol, HasOutputCol}
class CustomTransformer(override val uid: String) extends Transformer with
HasInputCol with HasOutputCol with DefaultParamsWritable
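[Editor's note: the compile error itself comes from Scala's qualified access modifiers. In Spark the qualifier is the real `ml` package; the self-contained sketch below uses an enclosing object named `ml` in place of that package, which gives the same visibility behavior without needing Spark on the classpath.]

```scala
// `private[ml]` makes a symbol visible only inside the enclosing scope
// named `ml` (a package in Spark; an object here, for a runnable demo).
object ml {
  private[ml] trait HasInputCol {
    def inputCol: String = "input"
  }

  // Inside `ml`: mixing in the trait compiles fine.
  object InsideMl {
    val ok: String = (new HasInputCol {}).inputCol
  }
}

// Outside `ml`, this would NOT compile -- the compiler reports that the
// symbol is inaccessible from this place:
//   class Custom extends ml.HasInputCol

object ScopeDemo extends App {
  println(ml.InsideMl.ok)  // input
}
```

This is why `class CustomTransformer ... with HasInputCol` fails when compiled outside the `org.apache.spark.ml` package: the trait's visibility qualifier excludes user code.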