dusantism-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1913441997
##########
sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -445,28 +445,29 @@ class SparkSession private(
   private def executeSqlScript(
       script: CompoundBody,
       args: Map[String, Expression] = Map.empty): DataFrame = {
-    val sse = new SqlScriptingExecution(script, this, args)
-    var result: Option[Seq[Row]] = None
-
-    // We must execute returned df before calling sse.getNextResult again because sse.hasNext
-    // advances the script execution and executes all statements until the next result. We must
-    // collect results immediately to maintain execution order.
-    // This ensures we respect the contract of SqlScriptingExecution API.
-    var df: Option[DataFrame] = sse.getNextResult
-    while (df.isDefined) {
-      sse.withErrorHandling {
-        // Collect results from the current DataFrame.
-        result = Some(df.get.collect().toSeq)
+    Utils.tryWithResource(new SqlScriptingExecution(script, this, args)) { sse =>

Review Comment:
   Yes, it is used for cleaning up the variable manager and its context when the script is finished. If you are asking whether we specifically need Utils.tryWithResource and extending Closeable: no, we could just use try .. finally blocks instead.
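For context, a minimal, self-contained sketch of the two cleanup styles discussed above: Utils.tryWithResource over a Closeable resource versus a plain try .. finally block. ScriptingRun and the simplified tryWithResource helper below are illustrative stand-ins, not Spark's actual SqlScriptingExecution or Utils implementation.

    import java.io.Closeable

    // Toy stand-in for SqlScriptingExecution; hypothetical, for illustration only.
    class ScriptingRun extends Closeable {
      def nextResult(): Option[String] = None                   // placeholder for getNextResult
      override def close(): Unit =
        println("cleaned up variable manager and its context")  // the cleanup close() is responsible for
    }

    object CleanupSketch {
      // Simplified equivalent of Utils.tryWithResource: close() runs even if the body throws.
      def tryWithResource[R <: Closeable, T](resource: R)(body: R => T): T =
        try body(resource) finally resource.close()

      def main(args: Array[String]): Unit = {
        // Closeable + tryWithResource style, as in the diff above:
        tryWithResource(new ScriptingRun) { run =>
          run.nextResult()
        }

        // Plain try .. finally alternative mentioned in the comment:
        val run = new ScriptingRun
        try run.nextResult()
        finally run.close()
      }
    }

Both variants guarantee the cleanup runs even when a statement in the body throws; the Closeable approach mainly makes the contract explicit in the type and lets a shared helper enforce it.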