comphead commented on PR #14392:
URL: https://github.com/apache/datafusion/pull/14392#issuecomment-2628334890

   I'm wondering how the user should choose which function to use. For instance,
the `to_timestamp` function may behave differently between Spark and non-Spark
environments.
   
   So the developer implements a Spark-compliant `to_timestamp` function in the
crate, but when they run the query
   ```
   select to_timestamp() from t1
   ```
   which implementation will be picked up? Should we introduce something like a
`FUNCTION CATALOG`, or a similar mechanism, to switch between implementations?
   
   

