Hi, I took your code and ran it on Spark 2.4.5 and it works fine for me. My first thought, like Sean's, is that you have a Spark ML version mismatch somewhere.
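If it is a mismatch, a quick way to confirm is to compare what the cluster actually runs against what you compiled with. A minimal sketch, assuming you can open spark-shell on the cluster:

    // In spark-shell on the cluster: print the runtime Spark and Scala
    // versions, then compare them with what your build file declares.
    println(s"Spark: ${spark.version}")
    println(s"Scala: ${scala.util.Properties.versionString}")

Keep in mind the Scala binary version (2.11 vs 2.12) has to line up too, not just the Spark version.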
Chris

> On 17 Aug 2020, at 16:18, Sean Owen <sro...@gmail.com> wrote:
>
> Hm, next guess: you need a no-arg constructor this() on FooTransformer? Also
> consider extending UnaryTransformer.
>
>> On Mon, Aug 17, 2020 at 9:08 AM Aviad Klein <aviad.kl...@fundbox.com> wrote:
>> Hi Owen, it's omitted from what I pasted, but I'm using Spark 2.4.4 on both.
>>
>>> On Mon, Aug 17, 2020 at 4:37 PM Sean Owen <sro...@gmail.com> wrote:
>>> Looks like you are building against Spark 3 and running on Spark 2, or
>>> something along those lines.
>>>
>>>> On Mon, Aug 17, 2020 at 4:02 AM Aviad Klein
>>>> <aviad.kl...@fundbox.com.invalid> wrote:
>>>> Hi, I've referenced the same problem on Stack Overflow and can't seem
>>>> to find answers.
>>>>
>>>> I have custom Spark PipelineStages written in Scala that are specific
>>>> to my organization. They work well in Scala Spark.
>>>>
>>>> However, when I try to wrap them as shown here, so I can use them in
>>>> PySpark, weird things happen, mostly around the constructors of the
>>>> Java objects.
>>>>
>>>> Please refer to the Stack Overflow question; it's the most thoroughly
>>>> documented.
>>>>
>>>> Thanks, any help is appreciated.
>>>>
>>>> --
>>>> Aviad Klein
>>>> Director of Data Science
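To make Sean's suggestion above concrete, here's a minimal sketch of a FooTransformer extending UnaryTransformer with an explicit no-arg this() constructor. The uppercasing logic and the "fooTransformer" uid prefix are placeholders, not your actual code; the parts that matter are the two constructors and the companion object, since ML persistence and the PySpark-side wrapper both create the stage reflectively.

    import org.apache.spark.ml.UnaryTransformer
    import org.apache.spark.ml.util.{DefaultParamsReadable, DefaultParamsWritable, Identifiable}
    import org.apache.spark.sql.types.{DataType, StringType}

    // Minimal sketch: a String => String stage. Uppercasing stands in for
    // whatever FooTransformer really does.
    class FooTransformer(override val uid: String)
        extends UnaryTransformer[String, String, FooTransformer]
        with DefaultParamsWritable {

      // The no-arg constructor Sean mentions: code that instantiates the
      // stage reflectively can fail in odd ways without it.
      def this() = this(Identifiable.randomUID("fooTransformer"))

      override protected def createTransformFunc: String => String = _.toUpperCase

      override protected def outputDataType: DataType = StringType
    }

    // Companion object so FooTransformer.load(...) works after saving a pipeline.
    object FooTransformer extends DefaultParamsReadable[FooTransformer]

On the Python side, the usual pattern is a small class extending pyspark.ml.wrapper.JavaTransformer that passes the fully qualified Java class name and the uid to _new_java_obj, so the (String) constructor has to stay in place as well.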