Hi all,

Is there any way to run a Spark job programmatically on a YARN cluster,
without using the spark-submit script?

I cannot include the Spark jars in my Java application (due to dependency
conflicts and other reasons), so I'll be shipping the Spark assembly uber jar
(spark-assembly-1.3.1-hadoop2.3.0.jar) to the YARN cluster and then executing
the job (Python or Java) in yarn-cluster mode.

So, is there any way to run a Spark job implemented as a Python file or Java
class without invoking it through the spark-submit script?
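
For context, here is roughly the kind of programmatic submission I have in
mind. This is just a sketch using SparkLauncher, which I understand ships
from Spark 1.4 onward as a small, dependency-light jar (it drives
spark-submit in a child process, but keeps Spark classes off my
application's classpath, which may be good enough); the paths and class
names below are placeholders:

import org.apache.spark.launcher.SparkLauncher;

public class YarnJobSubmitter {
    public static void main(String[] args) throws Exception {
        // Launch the job in a child process; only the small
        // spark-launcher jar needs to be on this app's classpath.
        Process spark = new SparkLauncher()
                .setSparkHome("/opt/spark")                      // placeholder path
                .setAppResource("hdfs:///jobs/my-spark-job.jar") // placeholder job jar
                .setMainClass("com.example.MySparkJob")          // placeholder main class
                .setMaster("yarn-cluster")
                .setConf(SparkLauncher.EXECUTOR_MEMORY, "2g")
                .launch();

        int exitCode = spark.waitFor();
        System.out.println("Spark job finished with exit code " + exitCode);
    }
}

That said, since I'm on 1.3.1, I'd be glad to hear about any approach that
works on that version as well.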

Thanks.
