Hi All,

I set the maximum number of task failures to 10 via spark.task.maxFailures in
my Spark application, which reads from Kafka and ingests data into Cassandra.
I observed that when the Cassandra service is down, Spark does not retry 10
times as configured; instead it retries with the default maxFailures of 4. Is
there something else I need to do to make Spark retry the connection to
Cassandra more than 4 times?
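
For reference, this is roughly how I am setting the property (a simplified
sketch; the app name is a placeholder and the actual Kafka source and
Cassandra sink options are omitted):

import org.apache.spark.sql.SparkSession

// Simplified sketch: spark.task.maxFailures is set on the builder
// before the session is created; the real Kafka/Cassandra code is omitted.
val spark = SparkSession.builder()
  .appName("kafka-to-cassandra-ingest")     // placeholder name
  .config("spark.task.maxFailures", "10")
  .getOrCreate()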

Thanks in Advance,
Ravi


