The RUNNING -> ACCEPTED transition in your log (with ApplicationMaster host N/A and a new start time) means the YARN ApplicationMaster died and YARN launched a second application attempt, which runs the whole job again from scratch with a fresh UI. Set spark.yarn.maxAppAttempts=1 if you don't want retries.
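
For example, assuming you launch with spark-submit (the jar name and the other
options below are placeholders for whatever you already pass), something like:

  spark-submit \
    --conf spark.yarn.maxAppAttempts=1 \
    <your existing --master / --class / other options> \
    your-app.jar

It needs to be set when the application is submitted (on the command line as
above, or in conf/spark-defaults.conf), since YARN fixes the maximum number of
attempts at submission time.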

On Thu, Apr 9, 2015 at 10:31 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:
> Hello,
> I have a Spark job with 5 stages. After it runs the 3rd stage, the console shows
>
>
> 15/04/09 10:25:57 INFO yarn.Client: Application report for
> application_1427705526386_127168 (state: RUNNING)
> 15/04/09 10:25:58 INFO yarn.Client: Application report for
> application_1427705526386_127168 (state: RUNNING)
> 15/04/09 10:25:59 INFO yarn.Client: Application report for
> application_1427705526386_127168 (state: ACCEPTED)
> 15/04/09 10:25:59 INFO yarn.Client:
> client token: N/A
> diagnostics: N/A
> ApplicationMaster host: N/A
> ApplicationMaster RPC port: -1
> queue: hdmi-express
> start time: 1428598679223
> final status: UNDEFINED
> tracking URL:
> https://apollo-phx-rm-1.vip.ebay.com:50030/proxy/application_1427705526386_127168/
> user: dvasthimal
> 15/04/09 10:26:00 INFO yarn.Client: Application report for
> application_1427705526386_127168 (state: ACCEPTED)
> 15/04/09 10:26:01 INFO yarn.Client: Application report for
> application_1427705526386_127168 (state: ACCEPTED)
>
> and then it starts running again. This looks as if the stage failed and Spark
> restarted the job from the beginning. If that's not the case, then when I open
> the Spark UI web page, it does not show the already completed stages and instead
> goes back to running stage #1. Is there some setting to turn this behavior
> off?
>
> --
> Deepak
>



-- 
Marcelo
