Hi,

Specifically, is this a runtime or a compile-time error?
I gather by class path you mean something like:

  spark-submit --master yarn --deploy-mode client --driver-class-path <full_path_to_custom_jar> --jars ......

HTH

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

*Disclaimer:* Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.

On Tue, 16 Feb 2021 at 05:23, HARSH TAKKAR <takkarha...@gmail.com> wrote:

> Hi,
>
> I have created a custom Estimator in Scala, which I can use successfully
> by creating a pipeline model in Java and Scala. But when I try to load a
> pipeline model saved with the Scala API in PySpark, I get an error
> saying the module was not found.
>
> I have included my custom model jar on the classpath using "spark.jars".
>
> Can you please help if I am missing something?
>
> Kind Regards
> Harsh Takkar
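A possible reason the jar settings don't help: the "module not found" is likely a Python-side import error rather than a JVM classpath problem. When PySpark loads a saved PipelineModel, it maps each stage's JVM class name (stored in the stage metadata) onto a Python module and tries to import it; built-in stages under org.apache.spark.ml have pyspark.ml counterparts, but a custom Scala estimator has no Python wrapper to import. A minimal sketch of that mapping, written as standalone Python (the function name `python_class_name` and the example class names are illustrative, not PySpark's actual API):

```python
def python_class_name(java_class: str) -> str:
    """Sketch of how PySpark derives the Python class to import for a
    pipeline stage saved from the JVM side (illustrative, not the real API)."""
    # Built-in stages live under org.apache.spark.ml.*, which maps
    # cleanly onto the pyspark.ml.* package.
    return java_class.replace("org.apache.spark", "pyspark")

# A built-in stage resolves to a real Python module:
print(python_class_name("org.apache.spark.ml.feature.StringIndexerModel"))

# A custom Scala estimator's class name is unchanged by the mapping, so
# Python tries to import a module (here the hypothetical "com.example.ml")
# that does not exist, raising ModuleNotFoundError regardless of what is
# on the JVM classpath:
print(python_class_name("com.example.ml.MyCustomEstimatorModel"))
```

If this is indeed the cause, adding the jar via "spark.jars" cannot fix it; the usual remedy is to write a small Python wrapper class for the custom estimator (mirroring how pyspark.ml wraps the built-in Scala stages) so that the derived module path is importable.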