The following might be helpful.

http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/What-dependencies-to-submit-Spark-jobs-programmatically-not-via/td-p/24721

http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/
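
For what it's worth, a minimal sketch of the SparkConf approach described in the reply below (assuming a standalone cluster; the master URL, memory setting and jar path are placeholders, not real values):

    import org.apache.spark.{SparkConf, SparkContext}

    object RemoteSubmitSketch {
      def main(args: Array[String]): Unit = {
        // Options you would normally pass to bin/spark-submit go into a SparkConf.
        // The master URL and jar path below are placeholders for your own cluster.
        val conf = new SparkConf()
          .setMaster("spark://your-master-host:7077")    // standalone cluster master
          .setAppName("remote-submit-example")
          .set("spark.executor.memory", "2g")
          .setJars(Seq("/path/to/your-application.jar")) // ships your code to the executors

        // Creating the SparkContext against the remote master registers the application
        // with the cluster; actions then run on the remote executors.
        val sc = new SparkContext(conf)
        try {
          println(sc.parallelize(1 to 1000).count())
        } finally {
          sc.stop()
        }
      }
    }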

On 7 April 2015 at 16:32, michal.klo...@gmail.com <michal.klo...@gmail.com>
wrote:

> A SparkContext can submit jobs remotely.
>
> In general, the spark-submit options can be populated into a SparkConf and
> passed in when you create a SparkContext.
>
> We personally have not had much success with yarn-client remote
> submission, but standalone cluster mode was easy to get working.
>
> M
>
>
>
> On Apr 7, 2015, at 7:01 PM, Prashant Kommireddi <prash1...@gmail.com>
> wrote:
>
> Hello folks,
>
> Newbie here! Just had a quick question - is there a job submission API
> such as the one Hadoop provides
>
> https://hadoop.apache.org/docs/r2.3.0/api/org/apache/hadoop/mapreduce/Job.html#submit()
> to submit Spark jobs to a YARN cluster? I see in the examples that
> bin/spark-submit is what's available, but I couldn't find any APIs around it.
>
> Thanks,
> Prashant
>
>


-- 
Regards
vybs
