Hm, next guess: do you need a no-arg constructor this() on FooTransformer?
Also consider extending UnaryTransformer.
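A minimal sketch of what that might look like, assuming FooTransformer maps strings to strings (the actual class body and types aren't shown in the thread, so these are placeholders). The no-arg this() delegating to the uid constructor is what lets Py4J (and DefaultParamsReadable) instantiate the class reflectively from PySpark:

```scala
import org.apache.spark.ml.UnaryTransformer
import org.apache.spark.ml.util.{DefaultParamsReadable, DefaultParamsWritable, Identifiable}
import org.apache.spark.sql.types.{DataType, StringType}

// Hypothetical FooTransformer: the transform logic here is illustrative only.
class FooTransformer(override val uid: String)
  extends UnaryTransformer[String, String, FooTransformer]
  with DefaultParamsWritable {

  // The no-arg constructor Py4J needs when PySpark's wrapper calls
  // self._new_java_obj("com.example.FooTransformer")
  def this() = this(Identifiable.randomUID("fooTransformer"))

  // UnaryTransformer only asks for the column function and the output type
  override protected def createTransformFunc: String => String = _.toUpperCase

  override protected def outputDataType: DataType = StringType
}

// Companion object so the stage can be loaded back from disk
object FooTransformer extends DefaultParamsReadable[FooTransformer]
```

Extending UnaryTransformer (rather than Transformer directly) also gets you inputCol/outputCol params for free, which the generic PySpark JavaTransformer wrapper can set through _set.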

On Mon, Aug 17, 2020 at 9:08 AM Aviad Klein <aviad.kl...@fundbox.com> wrote:

> Hi Owen, it's omitted from what I pasted but I'm using spark 2.4.4 on both.
>
> On Mon, Aug 17, 2020 at 4:37 PM Sean Owen <sro...@gmail.com> wrote:
>
>> Looks like you are building vs Spark 3 and running on Spark 2, or
>> something along those lines.
>>
>> On Mon, Aug 17, 2020 at 4:02 AM Aviad Klein
>> <aviad.kl...@fundbox.com.invalid> wrote:
>>
>>> Hi, I've asked about the same problem on Stack Overflow and can't seem
>>> to find answers.
>>>
>>> I have custom Spark PipelineStages written in Scala that are specific to
>>> my organization. They work well in Scala Spark.
>>>
>>> However, when I try to wrap them as shown here, so I can use them in
>>> PySpark, weird things happen, mostly around the constructors of the
>>> Java objects.
>>>
>>> Please refer to the Stack Overflow question
>>> <https://stackoverflow.com/questions/63439162/referencing-a-scala-java-pipelinestage-from-pyspark-constructor-issues-with-ha>,
>>> it's the most thoroughly documented description of the issue.
>>>
>>> Thanks, any help is appreciated
>>>
>>> --
>>> *Aviad Klein*
>>> Director of Data Science
>>>
>>>
>>>
>
> --
> *Aviad Klein*
> Director of Data Science
>
>
>