An example of doing this is provided in the Spark Jetty Server
project [1].

[1] https://github.com/calrissian/spark-jetty-server
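The approach taken there is to bypass the spark-submit script and hand an argument list to Spark's YARN Client directly. A minimal sketch of that idea is below; the flag names mirror what spark-submit would pass, and the commented-out submission code assumes Spark 1.3.x's org.apache.spark.deploy.yarn.Client/ClientArguments API is on the classpath (an assumption -- check the assembly version you actually ship, as this internal API has changed between releases):

```java
import java.util.Arrays;
import java.util.List;

public class YarnSubmitSketch {

    // Build the argument list that spark-submit would normally construct.
    // Flag names follow Spark 1.3.x's yarn ClientArguments (an assumption;
    // verify against the spark-assembly jar you deploy).
    static List<String> clientArgs(String appJar, String mainClass) {
        return Arrays.asList(
                "--jar", appJar,          // your application jar on HDFS or local
                "--class", mainClass,     // entry point of your job
                "--arg", "someAppArgument" // hypothetical argument to the job
        );
    }

    public static void main(String[] args) {
        List<String> a = clientArgs("hdfs:///apps/my-app.jar",
                                    "com.example.MyJob");
        System.out.println(a);

        // With spark-assembly-1.3.1-hadoop2.3.0.jar on the classpath, the
        // actual submission would look roughly like this (untested sketch,
        // using Spark-internal classes):
        //
        //   SparkConf conf = new SparkConf().setMaster("yarn-cluster");
        //   ClientArguments cArgs =
        //       new ClientArguments(a.toArray(new String[0]), conf);
        //   new Client(cArgs, conf).run();
    }
}
```

Because the arguments are plain strings, the same list can also be fed to a ProcessBuilder invocation of spark-submit if calling the internal Client ever breaks across Spark upgrades.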

On Wed, Jun 17, 2015 at 8:29 PM, Elkhan Dadashov <elkhan8...@gmail.com>
wrote:

> Hi all,
>
> Is there any way to run a Spark job programmatically on a YARN cluster
> without using the spark-submit script?
>
> I cannot include the Spark jars in my Java application (due to dependency
> conflicts and other reasons), so I'll be shipping the Spark assembly uber jar
> (spark-assembly-1.3.1-hadoop2.3.0.jar) to the YARN cluster and then executing
> the job (Python or Java) in yarn-cluster mode.
>
> So is there any way to run a Spark job implemented in a Python file or Java
> class without calling it through the spark-submit script?
>
> Thanks.
>
