I bet you are running on YARN in cluster mode.
If you are running on YARN in client mode,
.set("spark.yarn.maxAppAttempts", "1") works as you expect,
because YARN doesn't start your app on the cluster until you create the
SparkContext.
But if you are running on YARN in cluster mode, the driver program itself
runs inside a YARN container, so the application has already been submitted
by the time your code sets that property. In that case you have to pass it
at submit time instead, e.g. spark-submit --conf spark.yarn.maxAppAttempts=1.
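For client mode, something like this minimal Scala sketch should work (the
app name is just a placeholder of mine, and it assumes you launch with
spark-submit --master yarn-client):

    import org.apache.spark.{SparkConf, SparkContext}

    // Set the property before the SparkContext is created, while the
    // driver is still running locally; changing it later has no effect.
    val conf = new SparkConf()
      .setAppName("no-reattempt-demo")          // hypothetical app name
      .set("spark.yarn.maxAppAttempts", "1")    // a failed app is not retried
    val sc = new SparkContext(conf)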
Hi,
I think you may want to use this setting:
spark.task.maxFailures (default: 4): Number of individual task failures
before giving up on the job. Should be greater than or equal to 1. Number
of allowed retries = this value - 1.
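So to fail fast, a sketch along these lines (Scala; the app name is a
placeholder, and a value of 1 means allowed retries = 1 - 1 = 0):

    import org.apache.spark.{SparkConf, SparkContext}

    // With spark.task.maxFailures = 1, the first failure of any task
    // fails the whole job immediately instead of being retried.
    val conf = new SparkConf()
      .setAppName("fail-fast-demo")        // hypothetical app name
      .set("spark.task.maxFailures", "1")
    val sc = new SparkContext(conf)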
On Thu, May 7, 2015 at 2:34 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
> How can I stop Spark from re-attempting the app if there is any exception?