cloud-fan commented on code in PR #49445: URL: https://github.com/apache/spark/pull/49445#discussion_r1931463024
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveCatalogs.scala:
##########
@@ -34,11 +37,29 @@ class ResolveCatalogs(val catalogManager: CatalogManager)
   override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperatorsDown {
     // We only support temp variables for now and the system catalog is not properly implemented
     // yet. We need to resolve `UnresolvedIdentifier` for variable commands specially.
-    case c @ CreateVariable(UnresolvedIdentifier(nameParts, _), _, _) =>
-      val resolved = resolveVariableName(nameParts)
-      c.copy(name = resolved)
+    case c @ CreateVariable(UnresolvedIdentifier(nameParts, _), _, _, _) =>

Review Comment:
   What Spark does here is determine where to create the variable (catalog and namespace), and then turn the `UnresolvedIdentifier` into a qualified `ResolvedIdentifier`. I think we don't need an extra `sessionVariablesOnly` flag in `CreateVariable`; the qualified `ResolvedIdentifier` can determine everything.
   - If the variable name is already qualified (`session.var` or `system.session.var`), always fully qualify it to `system.session.var`, or fail if the qualifier is not `system.session`. This is because users can create session variables explicitly (via qualified names) anywhere.
   - If the variable name is unqualified: if we are not in a script, or we are inside EXECUTE IMMEDIATE, qualify it to `system.session.var`. Otherwise, qualify it to `local.current_scope_label_name.var`.

   We can create a `FakeLocalCatalog` following `FakeSystemCatalog`. In `CreateVariableExec`, if the catalog is `FakeLocalCatalog`, the script-local variable manager must be present, and we create local variables.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
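The qualification rules proposed in the comment can be sketched as a small, self-contained Scala function. This is an illustrative sketch only, not the actual Spark implementation: the parameters `inScript`, `inExecuteImmediate`, and `currentScopeLabel` are hypothetical stand-ins for state the analyzer would already track.

```scala
// Illustrative sketch of the proposed variable-name qualification rules.
// `inScript`, `inExecuteImmediate`, and `currentScopeLabel` are assumed inputs,
// not real Spark APIs.
object VariableNameResolutionSketch {
  private val System = "system"
  private val Session = "session"
  private val Local = "local"

  def resolve(
      nameParts: Seq[String],
      inScript: Boolean,
      inExecuteImmediate: Boolean,
      currentScopeLabel: String): Seq[String] = nameParts match {
    // Fully qualified: only `system.session.var` is legal.
    case Seq(c, n, v) if c.equalsIgnoreCase(System) && n.equalsIgnoreCase(Session) =>
      Seq(System, Session, v)
    // Partially qualified: `session.var` expands to `system.session.var`.
    case Seq(n, v) if n.equalsIgnoreCase(Session) =>
      Seq(System, Session, v)
    // Unqualified: a session variable outside scripts or inside EXECUTE IMMEDIATE,
    // otherwise a script-local variable scoped by the current label.
    case Seq(v) =>
      if (!inScript || inExecuteImmediate) Seq(System, Session, v)
      else Seq(Local, currentScopeLabel, v)
    // Anything else (wrong qualifier, too many parts) fails.
    case other =>
      throw new IllegalArgumentException(
        s"Invalid variable name: ${other.mkString(".")}")
  }
}
```

With this shape, the execution side would only need to branch on the resolved catalog part (`system` vs. a local `FakeLocalCatalog`-style marker), rather than carrying a separate `sessionVariablesOnly` flag through the plan.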
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org