cloud-fan commented on code in PR #52173:
URL: https://github.com/apache/spark/pull/52173#discussion_r2312759453
##########
sql/api/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########

@@ -523,6 +523,24 @@ abstract class SparkSession extends Serializable with Closeable {
     sql(sqlText, args.asScala.toMap)
   }
+
+  /**
+   * Executes a SQL query substituting parameters by the given arguments with optional names,
+   * returning the result as a `DataFrame`. This API eagerly runs DDL/DML commands, but not for
+   * SELECT queries. This method allows the inner query to determine whether to use positional
+   * or named parameters based on its parameter markers.
+   *
+   * @param sqlText
+   *   A SQL statement with named or positional parameters to execute.
+   * @param args
+   *   An array of Java/Scala objects that can be converted to SQL literal expressions.
+   * @param paramNames
+   *   An optional array of parameter names corresponding to args. If provided, enables named
+   *   parameter binding where parameter names are available. If None or shorter than args,
+   *   remaining parameters are treated as positional.

Review Comment:
   I'm a bit confused here. As of today, we require the parameter markers and the supplied parameter values to match:
   1. if the query uses named parameter markers, e.g. `:name`, then the supplied parameter values must be a Map.
   2. if the query uses positional parameter markers, e.g. `?`, then the supplied parameter values must be an Array.

   What is the change here? The supplied parameter values are two arrays so that we don't need to care about whether the parameter markers are named or positional?

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
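For context, a minimal sketch of the current contract the reviewer describes, using the existing `SparkSession.sql` overloads (named markers take a `Map`, positional markers take an `Array`). The local-mode session and the query strings are illustrative only; the PR's proposed `args`/`paramNames` two-array variant is not shown here since its exact signature is still under review.

```scala
import org.apache.spark.sql.SparkSession

object ParameterizedSqlDemo extends App {
  // Local session purely for illustration.
  val spark = SparkSession.builder()
    .master("local[1]")
    .appName("parameterized-sql-demo")
    .getOrCreate()

  // Today's contract, case 1: named markers (:name) require a Map of values.
  val named = spark.sql(
    "SELECT :label AS label, :count AS count",
    Map("label" -> "spark", "count" -> 3))
  named.show()

  // Today's contract, case 2: positional markers (?) require an Array of values.
  val positional = spark.sql(
    "SELECT ? AS label, ? AS count",
    Array("spark", 3))
  positional.show()

  // Mixing the two, e.g. sql("SELECT :label", Array("spark")),
  // fails analysis today because the markers and the argument
  // collection type must agree.

  spark.stop()
}
```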
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org