miland-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1920180208


##########
sql/core/src/main/scala/org/apache/spark/sql/scripting/SqlScriptingInterpreter.scala:
##########
@@ -63,6 +67,79 @@ case class SqlScriptingInterpreter(session: SparkSession) {
       case _ => None
     }
 
+  /**
+   * Transforms a [[CompoundBody]] into a [[CompoundBodyExec]].
+   *
+   * @param compoundBody
+   *   CompoundBody to be transformed into CompoundBodyExec.
+   * @param args
+   *   A map of parameter names to SQL literal expressions.
+   * @param context
+   *   SQL scripting execution context, used for additional processing during the transformation.
+   * @return
+   *   Executable version of the CompoundBody.
+   */
+  private def transformBodyIntoExec(
+      compoundBody: CompoundBody,
+      args: Map[String, Expression],
+      context: SqlScriptingExecutionContext): CompoundBodyExec = {
+    // Add drop variables to the end of the body.
+    val variables = compoundBody.collection.flatMap {
+      case st: SingleStatement => getDeclareVarNameFromPlan(st.parsedPlan)
+      case _ => None
+    }
+    val dropVariables = variables
+      .map(varName => DropVariable(varName, ifExists = true))
+      .map(new SingleStatementExec(_, Origin(), args, isInternal = true, context))
+      .reverse
+
+    // Create a map of conditions (SqlStates) to their respective handlers.
+    val conditionHandlerMap = HashMap[String, ErrorHandlerExec]()
+    compoundBody.handlers.foreach(handler => {
+      val handlerBodyExec =
+        transformBodyIntoExec(
+          handler.body,
+          args,
+          context)
+
+      // Execution node of handler.
+      val handlerScopeLabel = if (handler.handlerType == HandlerType.EXIT) {
+        Some(compoundBody.label.get)
+      } else {
+        None
+      }
+
+      val handlerExec = new ErrorHandlerExec(
+        handlerBodyExec,
+        handler.handlerType,
+        handlerScopeLabel)
+
+      // For each condition the handler is defined for, add the corresponding
+      // key-value pair to the conditionHandlerMap.
+      handler.conditions.foreach(condition => {
+        // Condition can either be the key in conditions map or SqlState.

Review Comment:
   That is a bit hard to achieve. It is allowed to have `DECLARE KP042 CONDITION FOR SQLSTATE '31000'`, and `KP042` would also pass the tests for a SQLSTATE.
   
   What this part of the code does is, for each condition/sqlstate/sqlexception/not found in the condition values list, e.g.:
    `DECLARE EXIT HANDLER FOR SQLSTATE '22012', KP042, DIVIDE_BY_ZERO BEGIN ... END;`
   add an entry (string key -> handler) to the map. If you add both a CONDITION and its SQL state to a handler's FOR list, an error is thrown because of the duplicate value in `visitConditionValues(ctx: ConditionValuesContext)` in `AstBuilder`.
   
   Does this behavior make sense, or should we do it differently?
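   For illustration, here is a minimal, self-contained sketch of the behavior described above (not the actual Spark code; `Handler` and `buildConditionHandlerMap` are simplified, hypothetical stand-ins): each condition value in a handler's FOR list, whether a declared condition name or a raw SQLSTATE, becomes one string key pointing to the handler, and a value that appears more than once is rejected up front, mirroring the duplicate check in `visitConditionValues`.

```scala
import scala.collection.mutable

// Hypothetical, simplified stand-in for the real handler execution node.
case class Handler(label: String)

// Builds the condition -> handler map. Each declaration pairs the list of
// condition values from its FOR clause with its handler. A condition value
// already present in the map signals a duplicate and is rejected.
def buildConditionHandlerMap(
    declarations: Seq[(Seq[String], Handler)]): mutable.HashMap[String, Handler] = {
  val map = mutable.HashMap[String, Handler]()
  declarations.foreach { case (conditions, handler) =>
    conditions.foreach { condition =>
      require(!map.contains(condition), s"Duplicate condition value: $condition")
      map(condition) = handler
    }
  }
  map
}

// Mirrors `DECLARE EXIT HANDLER FOR SQLSTATE '22012', KP042, DIVIDE_BY_ZERO ...`:
// three keys, one handler.
val m = buildConditionHandlerMap(
  Seq((Seq("22012", "KP042", "DIVIDE_BY_ZERO"), Handler("h1"))))
```

   Note that at this level `KP042` and `22012` are indistinguishable plain strings, which is why a condition name that merely looks like a SQLSTATE cannot be told apart here and the duplicate check has to happen earlier, on the raw FOR list.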



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

