Creating a SparkContext and setting the master to yarn-cluster unfortunately
will not work.
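
For what it's worth, this is roughly what that non-working approach looks like
(a minimal sketch; the class and app names below are made up). The SparkContext
constructor rejects it at startup, since in yarn-cluster mode the driver is
supposed to be launched inside the YARN cluster rather than in the calling JVM:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class InProcessYarnCluster {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("in-process-attempt")   // placeholder app name
                .setMaster("yarn-cluster");         // this is the part Spark rejects
        // Fails at startup: a yarn-cluster driver cannot be created directly
        // in a client-side JVM, so Spark tells you to use spark-submit instead.
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.stop();
    }
}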

SPARK-4924 added APIs to Spark for launching applications programmatically,
but they won't be included until 1.4.
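
For reference, here is a minimal sketch of what that launcher API
(org.apache.spark.launcher.SparkLauncher, available from 1.4) looks like from
Java; the Spark home, jar path, and main class are placeholders:

import org.apache.spark.launcher.SparkLauncher;

public class LaunchOnYarn {
    public static void main(String[] args) throws Exception {
        // Spawns a spark-submit process that runs the driver inside YARN.
        Process spark = new SparkLauncher()
                .setSparkHome("/opt/spark")               // assumed install path
                .setAppResource("/path/to/my-app.jar")    // placeholder application jar
                .setMainClass("com.example.MyApp")        // placeholder main class
                .setMaster("yarn-cluster")
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .launch();
        int exitCode = spark.waitFor();
        System.out.println("Spark application exited with code " + exitCode);
    }
}

Since launch() just hands back a java.lang.Process, this can be called from a
servlet container or JMS listener without pulling the Spark driver into that JVM.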

-Sandy

On Tue, Mar 17, 2015 at 3:19 AM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Create a SparkContext, set the master to yarn-cluster, and then run it as a
> standalone program?
>
> Thanks
> Best Regards
>
> On Tue, Mar 17, 2015 at 1:27 AM, rrussell25 <rrussel...@gmail.com> wrote:
>
>> Hi, were you ever able to determine a satisfactory approach for this
>> problem? I have a similar situation and would prefer to execute the job
>> directly from Java code within my JMS listener and/or servlet container.
>>
>>
>>
>
