Hi,

I think you may want to use this setting:

spark.task.maxFailures (default: 4)
    Number of individual task failures before giving up on the job.
    Should be greater than or equal to 1. Number of allowed retries =
    this value - 1.
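
For example, here is a minimal sketch of setting it from a Scala
driver so a task failure fails the job immediately (the app name and
context setup are just illustrative assumptions):

    import org.apache.spark.{SparkConf, SparkContext}

    // With spark.task.maxFailures = 1, a task that fails once fails
    // the whole job right away (1 - 1 = 0 allowed retries), so you
    // do not wait through retry attempts while debugging.
    val conf = new SparkConf()
      .setAppName("fail-fast-debug")        // hypothetical app name
      .set("spark.task.maxFailures", "1")
    val sc = new SparkContext(conf)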

On Thu, May 7, 2015 at 2:34 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

> How can I stop Spark from triggering a second attempt when the first
> one fails?
> I do not want to wait for the second attempt to fail again, so that I
> can debug faster.
>
> .set("spark.yarn.maxAppAttempts", "0") OR .set("spark.yarn.maxAppAttempts",
> "1")
>
> is not helping.
>
> --
> Deepak
>
>
