That should be fine. The JVM doesn't care how the bytecode it is
executing was produced. As long as the Java and Scala sources are
compiled together into one artifact - which sometimes means using a
plugin like scala-maven-plugin for mixed compilation - the combined
job will run fine.
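
To make that concrete, here is a minimal sketch of what such a
workflow could look like (the names MainJob, PayloadCalculator and
calculate are hypothetical, just to illustrate the call from Java
into Scala). The Java main class is what you pass to spark-submit;
the Scala object is compiled into the same jar. Because a top-level
Scala object gets static forwarder methods, the Java side can call
it like an ordinary static method:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    // Java entry point submitted via spark-submit. It assumes a Scala
    // object compiled into the same jar, for example:
    //   object PayloadCalculator {
    //     def calculate(payload: Dataset[Row]): Dataset[Row] = ...
    //   }
    // The Scala compiler emits static forwarders for top-level objects,
    // so the call below looks like a plain static method call from Java.
    public class MainJob {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("java-scala-workflow")
                    .getOrCreate();

            // Read the payload (path passed as the first argument).
            Dataset<Row> payload = spark.read().json(args[0]);

            // Delegate the per-payload calculation to the Scala module.
            Dataset<Row> result = PayloadCalculator.calculate(payload);

            result.write().parquet(args[1]);
            spark.stop();
        }
    }

The only build-side requirement is that the Scala sources are compiled
before (or together with) the Java ones, which is what scala-maven-plugin
takes care of in a mixed Maven project.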

On Sun, Aug 16, 2020 at 4:28 PM Ramesh Mathikumar
<meetr...@googlemail.com.invalid> wrote:
>
> Hi Team,
>
> A quick question from my side.
>
> Can I use spark-submit with a single workflow that contains both Java
> and Scala? By single workflow I mean the main program is in Java
> (wrapped in Spark) and it calls a module written in Scala (also
> wrapped in Spark) to calculate something on the payload.
>
> Are there any compatibility or interoperability issues to be aware of?
>
> Regards,
> Ramster
