Hi,
Never developed any custom Transformer (or UnaryTransformer in particular),
but I'd be for it if that's the case.
Jacek
On 28.03.2016 at 6:54 AM, "Maciej Szymkiewicz" wrote:
> Hi Jacek,
>
> In this context, don't you think it would be useful if at least some
> traits from org.apache.spark.ml.param.shared.sharedParams were public?
Hi Jacek,
In this context, don't you think it would be useful if at least some
traits from org.apache.spark.ml.param.shared.sharedParams were
public? HasInputCol(s) and HasOutputCol, for example. These are useful
pretty much every time you create a custom Transformer.
--
Best regards,
Maciej Szymkiewicz
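
For reference, here is a minimal sketch of what that means in practice today (assuming the 1.6-era DataFrame API; the UpperCaser class, its params, and setters are hypothetical names, not anything in Spark): because HasInputCol/HasOutputCol are private[ml], a custom Transformer has to re-declare the column params by hand.

import org.apache.spark.ml.Transformer
import org.apache.spark.ml.param.{Param, ParamMap}
import org.apache.spark.ml.util.Identifiable
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.upper
import org.apache.spark.sql.types.{StringType, StructType}

// Hypothetical custom Transformer; inputCol/outputCol are re-declared here
// only because the shared HasInputCol/HasOutputCol traits are not public.
class UpperCaser(override val uid: String) extends Transformer {

  def this() = this(Identifiable.randomUID("upperCaser"))

  // Hand-rolled equivalents of the non-public shared params.
  final val inputCol = new Param[String](this, "inputCol", "input column name")
  final val outputCol = new Param[String](this, "outputCol", "output column name")

  def setInputCol(value: String): this.type = set(inputCol, value)
  def setOutputCol(value: String): this.type = set(outputCol, value)

  // Upper-cases the input column into the output column.
  override def transform(dataset: DataFrame): DataFrame =
    dataset.withColumn($(outputCol), upper(dataset($(inputCol))))

  override def transformSchema(schema: StructType): StructType =
    schema.add($(outputCol), StringType)

  override def copy(extra: ParamMap): UpperCaser = defaultCopy(extra)
}

If the shared traits were public, the two Param declarations and their setters above would collapse to mixing in HasInputCol with HasOutputCol.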
Pingity-ping-pong since this is still a problem.
On Thu, Mar 24, 2016 at 4:08 PM, Michael Armbrust wrote:
> Patrick is investigating.
>
> On Thu, Mar 24, 2016 at 7:25 AM, Nicholas Chammas <nicholas.cham...@gmail.com> wrote:
>
>> Just checking in on this again as the builds on S3 are still broken.
The warning was added by:
SPARK-12757 Add block-level read/write locks to BlockManager
On Sun, Mar 27, 2016 at 12:24 PM, salexln wrote:
> Hi all,
>
> I started testing my code (https://github.com/salexln/FinalProject_FCM)
> with the latest Spark available on GitHub,
> and when I run it I get the following errors:
Hi all,
I started testing my code (https://github.com/salexln/FinalProject_FCM)
with the latest Spark available on GitHub,
and when I run it I get the following errors:
*scala> val clusters = FuzzyCMeans.train(parsedData, 2, 20, 2.0)*
16/03/27 22:24:10 WARN BlockManager: Block rdd_8_0 already exists on this machine; not re-adding it
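
(For anyone who wants to see the warning without building the FuzzyCMeans fork: the spark-shell sketch below is my own assumption, using the built-in mllib KMeans and the kmeans_data.txt sample file shipped in the Spark repo; it exercises the same cache-then-iterate path.)

import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

// Parse whitespace-separated doubles into vectors and cache them,
// as the MLlib clustering examples do.
val parsedData = sc.textFile("data/mllib/kmeans_data.txt")
  .map(line => Vectors.dense(line.split(' ').map(_.toDouble)))
  .cache()

// Iterative training re-reads the cached blocks on every iteration; on the
// 2.0.0-SNAPSHOT build this is where the BlockManager WARN shows up.
val clusters = KMeans.train(parsedData, 2, 20)

The message itself only says the block already exists and is not being re-added, so unless the clustering results look wrong it may just be noise from the new block-level locking rather than a real failure.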