[
https://issues.apache.org/jira/browse/CALCITE-7214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18024024#comment-18024024
]
Rafael Acevedo commented on CALCITE-7214:
-----------------------------------------
From my understanding/testing, both return NULL when the cast cannot be
performed (similar to SAFE_CAST in BigQuery). There can be some edge cases,
though, such as format differences when casting a string to a timestamp.
What's the current guideline for making a function available in multiple
dialects with regard to edge cases? Do the functions need to be exactly
equivalent?
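
To illustrate the behaviour described above, a small SQL sketch (Spark SQL
syntax, with BigQuery's SAFE_CAST for comparison; the string-to-timestamp
line is only an assumed example of where formats might diverge, not taken
from the Spark test suite):

  -- Spark SQL: try_cast returns NULL instead of raising an error
  SELECT try_cast('abc' AS INT);     -- NULL
  SELECT try_cast('123' AS INT);     -- 123

  -- BigQuery equivalent (same NULL-on-failure behaviour)
  SELECT SAFE_CAST('abc' AS INT64);  -- NULL

  -- Possible edge case: engines may accept different timestamp string
  -- formats, so results for inputs like this could differ
  SELECT try_cast('2024-01-01T00:00:00' AS TIMESTAMP);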
> Support TRY_CAST in spark dialect
> ---------------------------------
>
> Key: CALCITE-7214
> URL: https://issues.apache.org/jira/browse/CALCITE-7214
> Project: Calcite
> Issue Type: Bug
> Components: core
> Affects Versions: 1.40.0
> Reporter: Rafael Acevedo
> Priority: Minor
>
> In Spark, try_cast is supported (refs:
> [1] https://github.com/apache/spark/commit/3951e3371a83578a81474ed99fb50d59f27aac62
> [2] https://github.com/apache/spark/blob/46ac78ea367cfa9a7acc04482770aaca33f5a575/sql/core/src/test/resources/sql-tests/inputs/try_cast.sql
> [3]) but it's not enabled in the Spark dialect, only in MSSQL.
> Is there any reason why it's not enabled in the Spark dialect? I can work on
> a PR if enabling it makes sense.
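
For context, a minimal sketch of the query this would enable, assuming a
Calcite connection that loads function libraries through the "fun"
connection property (the exact property values below are an assumption for
illustration, not confirmed in this issue):

  -- Today this is expected to validate only with the MSSQL library loaded
  -- (e.g. jdbc:calcite:fun=mssql):
  SELECT TRY_CAST('abc' AS INTEGER);   -- NULL instead of an error

  -- The request is that the same query also validate with the Spark
  -- library loaded (e.g. jdbc:calcite:fun=spark), matching Spark's own
  -- try_cast behaviour.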
--
This message was sent by Atlassian Jira
(v8.20.10#820010)