Hi all,

I found that task retries are currently not supported<https://github.com/apache/spark/blob/5264164a67df498b73facae207eda12ee133be7d/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/continuous/ContinuousTaskRetryException.scala> in continuous processing mode. Is there currently another way to recover from continuous task failures? If not, are there plans to support this in a future release?
Thanks,
Basil