I think the short answer to the question is: no, there is no alternate API
that avoids the System.exit calls. You can craft a workaround like the one
suggested in this thread. For comparison, we do programmatic submission of
applications from a long-running client application. To get around these
issues we maintain a shadowed version of some of the Spark code in our
application with the System.exit calls removed, so that exceptions bubble
up to our application instead.
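
For illustration, here is a minimal sketch of that pattern; the names
(ExitFreeSubmit, runApp, SparkAppFailedException) are hypothetical
stand-ins, not our actual shadowed classes. The point is simply that each
System.exit(status) in the shadowed code becomes a throw, so the
long-running client can catch the failure:

  // Hypothetical sketch: turn exit statuses into exceptions instead of
  // halting the JVM.
  class SparkAppFailedException(val status: Int)
      extends RuntimeException(s"Spark application exited with status $status")

  object ExitFreeSubmit {
    // Stand-in for the shadowed submission logic; in the real shadowed
    // classes, each System.exit(status) call becomes the throw below.
    def submit(args: Array[String]): Unit = {
      val status = runApp(args)
      if (status != 0) throw new SparkAppFailedException(status)
    }

    // Placeholder for the actual submission/application logic.
    private def runApp(args: Array[String]): Int = 0
  }

  // The long-running client can then recover instead of dying:
  try {
    ExitFreeSubmit.submit(Array("--class", "com.example.MyApp"))
  } catch {
    case e: SparkAppFailedException =>
      println(s"Submission failed: ${e.getMessage}")
  }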

On Wed, Jun 3, 2015 at 7:19 AM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Did you try this?
>
> Create an sbt project like:
>
>  import org.apache.spark.{SparkConf, SparkContext}
>
>  // Create your context
>  val sconf = new SparkConf()
>    .setAppName("Sigmoid")
>    .setMaster("spark://sigmoid:7077")
>  val sc = new SparkContext(sconf)
>
>  // Do some computations
>  sc.parallelize(1 to 10000).take(10).foreach(println)
>
>  // Now return the exit status
>  System.exit(0) // or a non-zero code when the job fails
>
>  Now, make your workflow manager trigger *sbt run* on the project
> instead of using spark-submit.
>
>
>
> Thanks
> Best Regards
>
> On Wed, Jun 3, 2015 at 2:18 PM, pavan kumar Kolamuri <
> pavan.kolam...@gmail.com> wrote:
>
>> Hi Akhil, sorry, I may not be conveying the question properly. Actually,
>> we are looking to launch a Spark job from a long-running workflow manager,
>> which invokes the Spark client via SparkSubmit. Unfortunately, upon
>> successful completion of the application the client exits with
>> System.exit(0), or with System.exit(NON_ZERO) when there is a failure. The
>> question is: is there an alternate API through which a Spark application
>> can be launched that returns an exit status to the caller, as opposed to
>> initiating a JVM halt?
>>
>> On Wed, Jun 3, 2015 at 12:58 PM, Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>
>>> Run it as a standalone application. Create an sbt project and do *sbt run*.
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Wed, Jun 3, 2015 at 11:36 AM, pavan kumar Kolamuri <
>>> pavan.kolam...@gmail.com> wrote:
>>>
>>>> Hi guys, I am new to Spark. I am using SparkSubmit to submit Spark
>>>> jobs, but for my use case I don't want it to exit with System.exit. Is
>>>> there any other Spark client, more API-friendly than SparkSubmit, that
>>>> doesn't exit with System.exit? Please correct me if I am missing
>>>> something.
>>>>
>>>> Thanks in advance
>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Regards
>>>> Pavan Kumar Kolamuri
>>>>
>>>>
>>>
>>
>>
>> --
>> Regards
>> Pavan Kumar Kolamuri
>>
>>
>
