mridulm commented on PR #46620:
URL: https://github.com/apache/spark/pull/46620#issuecomment-2126341592

   While this PR does not include it, leveraging the change introduced for 
`SparkStopAMRetryException` will require updating the existing exception 
handling (to throw `SparkStopAMRetryException` instead of whatever is thrown 
today), which would be a backward-incompatible change.
   
   
   I am wondering if we can leverage `SparkException.errorClass` instead, 
since `SparkException` is thrown by Spark? We could return 
`EXIT_STOP_AM_RETRY` for some specific subset of error classes.
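   A minimal sketch of what I have in mind, assuming a retryable subset of 
error classes. The `SparkException` stand-in, the `EXIT_STOP_AM_RETRY` / 
`EXIT_FAIL` values, and the error-class names here are all placeholders for 
illustration, not actual Spark constants:

   ```scala
   // Stand-in for org.apache.spark.SparkException, which exposes an
   // optional error class alongside the message.
   class SparkException(message: String, val errorClass: Option[String])
     extends Exception(message)

   object AmExitCodes {
     val EXIT_FAIL = 1            // hypothetical "hard failure" exit code
     val EXIT_STOP_AM_RETRY = 16  // hypothetical "stop AM and let YARN retry" code

     // Hypothetical subset of error classes treated as retryable.
     val retryableErrorClasses: Set[String] =
       Set("AM_HEARTBEAT_TIMEOUT", "DRIVER_DISCONNECTED")

     // Map a caught throwable to an AM exit code: SparkExceptions whose
     // error class is in the retryable subset request an AM retry;
     // everything else fails hard.
     def exitCodeFor(e: Throwable): Int = e match {
       case se: SparkException
           if se.errorClass.exists(retryableErrorClasses.contains) =>
         EXIT_STOP_AM_RETRY
       case _ =>
         EXIT_FAIL
     }
   }
   ```

   The appeal is that existing call sites keep throwing `SparkException` as 
they do today; only the exit-code mapping in the AM needs to consult the error 
class, so no new exception type leaks into user-facing behavior.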
   
   +CC @MaxGekk in case this idea makes sense!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

