iffyio commented on code in PR #1498:
URL: https://github.com/apache/datafusion-sqlparser-rs/pull/1498#discussion_r1831399100


##########
tests/sqlparser_mssql.rs:
##########
@@ -192,6 +192,35 @@ fn parse_mssql_apply_join() {
     );
 }
 
+#[test]
+fn parse_mssql_cross_apply_json() {

Review Comment:
   Could we add one scenario that checks the returned AST (i.e. that we get a `TableFactor::OpenJsonTable` containing the intended components of the input)?
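   For reference, a minimal sketch of such a check, assuming the `TableFactor::OpenJsonTable` fields shown in the diff below and sqlparser's usual `TableWithJoins`/`Join` layout (the test name and the expected values are illustrative, not part of the PR):
    ```rust
    #[test]
    fn parse_mssql_openjson_ast() {
        let select = ms().verified_only_select(
            "SELECT B.kind FROM t_test_table AS A CROSS APPLY OPENJSON(A.param, '$.config') AS B",
        );
        // CROSS APPLY is parsed as a join, so the OPENJSON factor is the joined relation.
        match &select.from[0].joins[0].relation {
            TableFactor::OpenJsonTable {
                json_expr,
                json_path,
                columns,
                alias,
            } => {
                assert_eq!(
                    json_expr,
                    &Expr::CompoundIdentifier(vec![Ident::new("A"), Ident::new("param")])
                );
                assert_eq!(
                    json_path,
                    &Some(Value::SingleQuotedString("$.config".to_string()))
                );
                assert!(columns.is_empty());
                assert_eq!(alias.as_ref().map(|a| a.name.value.as_str()), Some("B"));
            }
            other => panic!("expected TableFactor::OpenJsonTable, got: {:?}", other),
        }
    }
    ```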



##########
tests/sqlparser_mssql.rs:
##########
@@ -192,6 +192,35 @@ fn parse_mssql_apply_join() {
     );
 }
 
+#[test]
+fn parse_mssql_cross_apply_json() {
+    let _ = ms().verified_only_select(
+        "SELECT B.kind, B.id_list \
+        FROM t_test_table AS A \
+        CROSS APPLY OPENJSON(A.param, '$.config') WITH (kind VARCHAR(20) '$.kind', [id_list] NVARCHAR(MAX) '$.id_list' AS JSON) AS B",
+    );

Review Comment:
   ```suggestion
       ms_and_generic().verified_only_select(
           "SELECT B.kind, B.id_list \
           FROM t_test_table AS A \
        CROSS APPLY OPENJSON(A.param, '$.config') WITH (kind VARCHAR(20) '$.kind', [id_list] NVARCHAR(MAX) '$.id_list' AS JSON) AS B",
       );
   ```



##########
src/parser/mod.rs:
##########
@@ -10000,7 +10000,7 @@ impl<'a> Parser<'a> {
                     table_with_joins: Box::new(table_and_joins),
                     alias,
                 })
-            } else if dialect_of!(self is SnowflakeDialect | GenericDialect) {
+            } else if dialect_of!(self is SnowflakeDialect | GenericDialect | MsSqlDialect) {

Review Comment:
   Yeah, +1 to @lovasoa's suggestion. One double check on my end, @gaoqiangz: are parentheses around the joined table supported syntax for OPENJSON? Thinking that if they're not, we would be able to skip this change entirely; if they are, then indeed let's update the comments to account for MSSQL.
   Additionally, can we add a test case covering the behavior (as in a roundtrip test like `SELECT ... FROM T AS A CROSS APPLY (OPENJSON(...) WITH ...) AS B`) so that we catch any regression on the behavior going forward?
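   If the parenthesized form does turn out to be valid, a roundtrip sketch along these lines could pin the behavior down (the SQL and test name are illustrative; `verified_only_select` already asserts that the parsed AST serializes back to the input string):
    ```rust
    #[test]
    fn parse_mssql_cross_apply_parenthesized_openjson() {
        // Roundtrip check: fails if the parenthesized OPENJSON table factor
        // does not serialize back to the exact input SQL.
        let _ = ms().verified_only_select(
            "SELECT B.kind FROM t_test_table AS A \
            CROSS APPLY (OPENJSON(A.param, '$.config') WITH (kind VARCHAR(20) '$.kind')) AS B",
        );
    }
    ```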



##########
src/parser/mod.rs:
##########
@@ -10133,6 +10134,30 @@ impl<'a> Parser<'a> {
                 columns,
                 alias,
             })
+        } else if self.parse_keyword_with_tokens(Keyword::OPENJSON, &[Token::LParen]) {
+            let json_expr = self.parse_expr()?;
+            let json_path = if self.consume_token(&Token::Comma) {
+                Some(self.parse_value()?)
+            } else {
+                None
+            };
+            self.expect_token(&Token::RParen)?;
+            let columns = if self.parse_keyword(Keyword::WITH) {
+                self.expect_token(&Token::LParen)?;
+                let columns =
+                    self.parse_comma_separated(Parser::parse_openjson_table_column_def)?;
+                self.expect_token(&Token::RParen)?;
+                columns
+            } else {
+                Vec::new()
+            };
+            let alias = self.parse_optional_table_alias(keywords::RESERVED_FOR_TABLE_ALIAS)?;
+            Ok(TableFactor::OpenJsonTable {
+                json_expr,
+                json_path,
+                columns,
+                alias,
+            })

Review Comment:
   One request (I think the other branches in this function don't do a good job of it :) could we move this to its own function, e.g. `self.parse_open_json_table_factor()`, and call that function from this branch instead? Thinking in order to keep the branches shorter and more focused.
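   For reference, a sketch of that extraction, reusing the body from the diff above (the helper name follows the suggestion; the signature mirrors the neighboring parser methods):
    ```rust
    /// Parses the body of `OPENJSON( <json_expr> [, <json_path>] ) [WITH (...)] [alias]`,
    /// assuming the caller has already consumed the `OPENJSON(` tokens.
    fn parse_open_json_table_factor(&mut self) -> Result<TableFactor, ParserError> {
        let json_expr = self.parse_expr()?;
        let json_path = if self.consume_token(&Token::Comma) {
            Some(self.parse_value()?)
        } else {
            None
        };
        self.expect_token(&Token::RParen)?;
        let columns = if self.parse_keyword(Keyword::WITH) {
            self.expect_token(&Token::LParen)?;
            let columns =
                self.parse_comma_separated(Parser::parse_openjson_table_column_def)?;
            self.expect_token(&Token::RParen)?;
            columns
        } else {
            Vec::new()
        };
        let alias = self.parse_optional_table_alias(keywords::RESERVED_FOR_TABLE_ALIAS)?;
        Ok(TableFactor::OpenJsonTable {
            json_expr,
            json_path,
            columns,
            alias,
        })
    }
    ```
    The branch in `parse_table_factor` then shrinks to a one-liner:
    ```rust
    } else if self.parse_keyword_with_tokens(Keyword::OPENJSON, &[Token::LParen]) {
        self.parse_open_json_table_factor()
    }
    ```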


