[ https://issues.apache.org/jira/browse/FLINK-26401?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
zoucao updated FLINK-26401:
---------------------------
Description: For now, many reserved keywords are defined for Flink SQL [1], and some of them differ from those in other big data systems such as Spark [2] and Hive [3]. Because of this, users may run into compatibility problems when moving queries between systems. For example, the keyword 'DATE' is marked as reserved by Flink and Hive, but not by Spark. If a Hive view is defined as
{code:java}
select a, b, date from hive_tb where date = '2022-02-28'
{code}
it works well in Spark, but Flink throws a SqlParserException: 'SQL parse failed. Encountered "date" at line xxx, column xxx'. If we do nothing to improve compatibility, users must rewrite the view's query as
{code:java}
select a, b, `date` from hive_tb where date = '2022-02-28'
{code}
to make Flink work. I think we can add a converter that surrounds reserved keywords with backticks before parsing.
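To make the idea concrete, here is a minimal sketch of such a converter, assuming a simple regex-based token scan; the class name ReservedKeywordQuoter, the method quoteReservedKeywords, and the truncated keyword set are hypothetical, and a real implementation would reuse Flink's full reserved-keyword list and a proper SQL lexer:
{code:java}
import java.util.Locale;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of the proposed pre-parse converter: backtick-quotes
// Flink-reserved keywords in a view definition before it reaches the parser.
// The keyword set is truncated for illustration; a real implementation would
// use Flink's full reserved-keyword list and a real lexer so that comments
// and other edge cases are handled correctly.
public class ReservedKeywordQuoter {

    private static final Set<String> RESERVED =
            Set.of("DATE", "TIME", "TIMESTAMP", "VALUE", "POSITION");

    // Backtick-quoted identifiers and string literals are matched first,
    // so they pass through unchanged; only bare identifiers are inspected.
    private static final Pattern TOKEN =
            Pattern.compile("`[^`]*`|'[^']*'|[A-Za-z_][A-Za-z0-9_]*");

    public static String quoteReservedKeywords(String sql) {
        Matcher m = TOKEN.matcher(sql);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String token = m.group();
            String replacement =
                    RESERVED.contains(token.toUpperCase(Locale.ROOT))
                            ? "`" + token + "`"
                            : token;
            m.appendReplacement(out, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        // prints: select a, b, `date` from hive_tb where `date` = '2022-02-28'
        System.out.println(quoteReservedKeywords(
                "select a, b, date from hive_tb where date = '2022-02-28'"));
    }
}
{code}
With this sketch, the view definition above would be rewritten automatically, so users would not have to edit the Hive view by hand; string literals such as '2022-02-28' are left untouched.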
[1] https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sql/overview/
[2] https://spark.apache.org/docs/3.2.1/sql-ref-ansi-compliance.html#sql-keywords
[3] https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Keywords,Non-reservedKeywordsandReservedKeywords

> improve the compatibility for hive catalogView
> ----------------------------------------------
>
>                 Key: FLINK-26401
>                 URL: https://issues.apache.org/jira/browse/FLINK-26401
>             Project: Flink
>          Issue Type: Improvement
>          Components: Connectors / Hive
>            Reporter: zoucao
>            Priority: Major