hvanhovell commented on code in PR #47572:
URL: https://github.com/apache/spark/pull/47572#discussion_r1705487892


##########
sql/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##########
@@ -1614,14 +1614,23 @@ class SparkConnectPlanner(
       fun: proto.Expression.UnresolvedFunction): Expression = {
     if (fun.getIsUserDefinedFunction) {
       UnresolvedFunction(
-        parser.parseFunctionIdentifier(fun.getFunctionName),
+        parser.parseMultipartIdentifier(fun.getFunctionName),
         fun.getArgumentsList.asScala.map(transformExpression).toSeq,
         isDistinct = fun.getIsDistinct)
     } else {
+      // Spark Connect historically used the global namespace to lookup a couple of internal
+      // functions (e.g. product, collect_top_k, unwrap_udt, ...). In Spark 4 we moved these
+      // functions to a dedicated namespace, however in order to stay backwards compatible we still
+      // need to allow connect to use the global namespace. Here we check if a function is
+      // registered in the internal function registry, and we reroute the lookup to the internal
+      // registry.
+      val name = fun.getFunctionName
+      val internal = FunctionRegistry.internal.functionExists(FunctionIdentifier(name))

Review Comment:
   I don't want to fall back because I don't want these internal functions to clash with any UDF the user specifies. If you ask for an internal function, you will get an internal function. In a follow-up I want to use a special prefix for Connect internal functions, and probably add a conf controlling this lookup behavior (disabling it for newer clients).
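
   To make the intended no-fallback policy concrete, here is a minimal, self-contained sketch. All registry names and helpers in it are hypothetical stand-ins, not the actual SparkConnectPlanner code: a name present in the internal registry always resolves internally and never clashes with a user-registered UDF of the same name.

   ```scala
   // Hypothetical sketch of the lookup policy described above. The two
   // registries stand in for FunctionRegistry.internal and the session's
   // user-facing registry; neither is the real Spark API.
   object InternalLookupSketch {
     private val internalRegistry = Set("product", "collect_top_k", "unwrap_udt")
     private val userRegistry = Map("product" -> "user UDF named product")

     def resolve(name: String): String =
       if (internalRegistry.contains(name)) {
         // Internal names win unconditionally: no fallback, no clash with UDFs.
         s"internal function: $name"
       } else {
         userRegistry.get(name)
           .map(udf => s"user function: $udf")
           .getOrElse(s"unresolved: $name")
       }

     def main(args: Array[String]): Unit = {
       // "product" resolves internally even though a UDF shadows the name.
       println(resolve("product")) // internal function: product
       println(resolve("my_udf")) // unresolved: my_udf
     }
   }
   ```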



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
