srielau commented on code in PR #49445: URL: https://github.com/apache/spark/pull/49445#discussion_r1913294290
########## sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ColumnResolutionHelper.scala:
##########
@@ -266,22 +268,40 @@ trait ColumnResolutionHelper extends Logging with DataTypeErrorsBase {
       }
     }

-    if (maybeTempVariableName(nameParts)) {
-      val variableName = if (conf.caseSensitiveAnalysis) {
-        nameParts.last
-      } else {
-        nameParts.last.toLowerCase(Locale.ROOT)
-      }
-      catalogManager.tempVariableManager.get(variableName).map { varDef =>
+    val namePartsCaseAdjusted = if (conf.caseSensitiveAnalysis) {
+      nameParts
+    } else {
+      nameParts.map(_.toLowerCase(Locale.ROOT))
+    }
+
+    catalogManager.scriptingLocalVariableManager
+      // If sessionOnly is set to true lookup only session variables.
+      .filterNot(_ => sessionVariablesOnly)
+      // If variable name is qualified with system.session.<varName> treat it as a session variable.
+      .filterNot(_ => nameParts.take(2).map(_.toLowerCase(Locale.ROOT)) == Seq("system", "session"))

Review Comment:
   SESSION.var1 should be resolved just like any other multipart identifier. If there is no match in the inner scopes, resolution will "spill" outward, first to a stored procedure parameter name and then to a session variable. You may think of session variables as belonging to a virtual outermost BEGIN wrapping the entire session.
   https://docs.google.com/document/d/1uFv2VoqDoOH2k6HfgdkBYxp-Ou7Qi721eexNW6IYWwk/edit?tab=t.0#heading=h.2lhf3u4s4p4p

   It will be hard to forbid session() as a procedure name, so session.var may be a parameter reference (hence the escape hatch of system.session.var).
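
   To make the intended scope-spilling concrete, here is a minimal, self-contained Scala sketch of the resolution order described above: innermost script scopes first, then stored procedure parameters, then session variables, with SYSTEM.SESSION.<name> as the explicit escape hatch. This is an illustration only, not Spark's implementation; `VariableResolutionSketch`, `Scope`, and the other names are invented for the example, and it deliberately ignores the case where the qualifier itself names a procedure (the session() ambiguity mentioned above).

```scala
// Illustrative sketch only -- not Spark's implementation. All names here
// (VariableResolutionSketch, Scope, Resolved, ...) are invented for the example.
object VariableResolutionSketch {

  sealed trait Resolved
  case class LocalVariable(name: String) extends Resolved
  case class ProcedureParameter(name: String) extends Resolved
  case class SessionVariable(name: String) extends Resolved

  // One scripting scope, e.g. a BEGIN ... END block; scopes are listed innermost first.
  case class Scope(variables: Set[String])

  def resolve(
      nameParts: Seq[String],
      scriptScopes: Seq[Scope],
      procedureParams: Set[String],
      sessionVariables: Set[String]): Option[Resolved] = {
    val lower = nameParts.map(_.toLowerCase)
    val varName = lower.last

    if (lower.take(2) == Seq("system", "session")) {
      // Escape hatch: SYSTEM.SESSION.<name> always targets the session scope.
      Option.when(sessionVariables.contains(varName))(SessionVariable(varName))
    } else {
      // Otherwise resolve like any multipart identifier: innermost scopes first,
      // then "spill" to procedure parameters, then to session variables
      // (the virtual outermost BEGIN wrapping the whole session).
      scriptScopes
        .collectFirst { case s if s.variables.contains(varName) => LocalVariable(varName) }
        .orElse(Option.when(procedureParams.contains(varName))(ProcedureParameter(varName)))
        .orElse(Option.when(sessionVariables.contains(varName))(SessionVariable(varName)))
    }
  }

  def main(args: Array[String]): Unit = {
    val scopes = Seq(Scope(Set("x")))  // no local "var1"
    // SESSION.var1 spills past the inner scopes to the session variable.
    println(resolve(Seq("SESSION", "var1"), scopes, Set.empty, Set("var1")))
    // A procedure parameter named var1 shadows the session variable ...
    println(resolve(Seq("var1"), scopes, Set("var1"), Set("var1")))
    // ... but SYSTEM.SESSION.var1 still reaches the session variable.
    println(resolve(Seq("SYSTEM", "SESSION", "var1"), scopes, Set("var1"), Set("var1")))
  }
}
```

   Under this model, SESSION.var1 only reaches the session variable when nothing closer shadows it, whereas SYSTEM.SESSION.var1 always does.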