Looks like you are building against Spark 3 and running on Spark 2, or
something along those lines.
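
One quick check (a minimal sketch, assuming a live SparkSession named
spark): print the version of the cluster actually running the job and
compare it with the Spark version your Scala stages were compiled against.

import pyspark

# Version of the PySpark package installed on the driver:
print(pyspark.__version__)
# Version of the cluster actually executing the job:
print(spark.version)
# Compare both with the spark-sql/spark-mllib version pinned in your Scala
# project's build file; stages built against 3.x won't load on a 2.x runtime.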
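
For reference, the usual Python-side wrapper pattern looks roughly like
this (a sketch only; com.example.ml.MyScalaStage and its params are
hypothetical stand-ins for your classes):

from pyspark import keyword_only
from pyspark.ml.wrapper import JavaTransformer
from pyspark.ml.param.shared import HasInputCol, HasOutputCol

class MyScalaStage(JavaTransformer, HasInputCol, HasOutputCol):
    @keyword_only
    def __init__(self, inputCol=None, outputCol=None):
        super(MyScalaStage, self).__init__()
        # Instantiates the Scala class over py4j; this requires a
        # single-String (uid) constructor on the Scala side.
        self._java_obj = self._new_java_obj(
            "com.example.ml.MyScalaStage", self.uid)
        kwargs = self._input_kwargs
        self.setParams(**kwargs)

    @keyword_only
    def setParams(self, inputCol=None, outputCol=None):
        kwargs = self._input_kwargs
        return self._set(**kwargs)

If the JAR on the classpath was built against a different Spark major
version than the one running, that _new_java_obj call is usually where
the py4j constructor errors surface.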

On Mon, Aug 17, 2020 at 4:02 AM Aviad Klein <aviad.kl...@fundbox.com.invalid>
wrote:

> Hi, I've posted the same problem on Stack Overflow and can't seem to
> find answers.
>
> I have custom Spark PipelineStages written in Scala that are specific to
> my organization. They work well in Scala Spark.
>
> However, when I try to wrap them as shown here, so I can use them in
> PySpark, strange things happen, mostly around the constructors of the
> Java objects.
>
> Please refer to the Stack Overflow question
> <https://stackoverflow.com/questions/63439162/referencing-a-scala-java-pipelinestage-from-pyspark-constructor-issues-with-ha>;
> it has the most detail.
>
> Thanks, any help is appreciated
>
> --
> *Aviad Klein*
> Director of Data Science
>
>
>
