guixiaowen commented on PR #46620:
URL: https://github.com/apache/spark/pull/46620#issuecomment-2127256543

   > > I am wondering if we can leverage `SparkException.errorClass` instead - since `SparkException` is thrown by Spark? Return `EXIT_STOP_AM_RETRY` for some specific subset of error classes?
   > 
   > Agree, this can be used to handle existing exceptions. Maybe it's a good idea to include the few highest-frequency error classes from your production environment, while `SparkStopAMRetryException` can be used to handle error scenarios after this PR. +CC @LuciferYang
   
   @mridulm  @summaryzb  
   
   Thank you both for helping me review this PR.
   
   In fact, I initially considered reusing Spark's existing exception classes here.
   
   But without a new exception type, that approach may not work in this spot. In yarn-cluster mode, the ApplicationMaster decides whether a retry is needed based on the current exception, for example:
   
    e.getCause match {
      case _: InterruptedException =>
        // The reporter thread interrupts the user class; nothing to decide here.
      case SparkUserAppException(exitCode) =>
        // Decide by inspecting the message: e.g. whether e.getMessage contains
        // "Table or view not found:" or another high-frequency error message.
   
   This code is in `ApplicationMaster`.
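   
   For reference, here is a rough sketch (my reading, not code from this PR) of how the `errorClass` suggestion above could slot into that match. `EXIT_STOP_AM_RETRY` is the exit code proposed in the review comment, not an existing constant, and the error class string is only an example of a high-frequency, non-retryable failure:
   
    import org.apache.hadoop.yarn.api.records.FinalApplicationStatus
    import org.apache.spark.SparkException
    
    e.getCause match {
      // Sketch: EXIT_STOP_AM_RETRY is the proposed exit code, not an existing constant.
      case se: SparkException if se.getErrorClass == "TABLE_OR_VIEW_NOT_FOUND" =>
        // A known non-retryable failure: fail this attempt and stop further AM retries.
        finish(FinalApplicationStatus.FAILED, ApplicationMaster.EXIT_STOP_AM_RETRY,
          se.getMessage)
      case cause: Throwable =>
        // Fall through to the existing handling for everything else.
    }
   
   This only covers exceptions that Spark itself raises with an error class, which is exactly the gap described next.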
   
   
   But if the user throws their own user-defined exception, such as:
   
    throw new MyTestException("this is a test exception, I want to stop AM retry.")
   
   then the ApplicationMaster has no way to recognize that user-defined exception type.
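   
   For comparison, a minimal sketch of what the new exception is meant to enable, assuming `SparkStopAMRetryException` is the class this PR introduces. The AM can then match on the exception type rather than on the message text:
   
    // User code (yarn-cluster mode): explicitly signal that another AM
    // attempt cannot succeed, instead of hoping a message substring matches.
    throw new SparkStopAMRetryException("input data is gone, stop AM retry")
    
    // ApplicationMaster side (sketch): the cause is matched by type, so
    // whatever message the user chooses still stops the retry.
    e.getCause match {
      case stop: SparkStopAMRetryException =>
        finish(FinalApplicationStatus.FAILED, ApplicationMaster.EXIT_STOP_AM_RETRY,
          stop.getMessage)
      case _ => // the existing retry handling continues to apply
    }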


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

