Re: [PR] chore: Add asf.yaml [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


alamb merged PR #1463:
URL: https://github.com/apache/datafusion-sqlparser-rs/pull/1463





Re: [PR] chore: Add asf.yaml [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


Xuanwo commented on code in PR #1463:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1463#discussion_r1793693500


##
.asf.yaml:
##
@@ -0,0 +1,42 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# This file controls the settings of this repository
+#
+# See more details at
+# https://cwiki.apache.org/confluence/display/INFRA/Git+-+.asf.yaml+features
+
+notifications:
+  commits: comm...@datafusion.apache.org
+  issues: git...@datafusion.apache.org
+  pullrequests: git...@datafusion.apache.org
+github:
+  description: "Extensible SQL Lexer and Parser for Rust"
+  labels:
+    - big-data
+    - rust
+    - sql
+  enabled_merge_buttons:
+    squash: true
+    merge: false
+    rebase: false
+  features:
+    issues: true
+  protected_branches:
+    main:
+      required_pull_request_reviews:
+        required_approving_review_count: 1

Review Comment:
   Makes sense.






Proposal to trim down Ballista to a small library

2024-10-09 Thread Andy Grove
There is an epic PR/proposal [1] that I would like everyone to be aware of.

The PR has a lot of detail to explain the motivation, but the TL;DR is that
this PR trims Ballista down to be a library that people can extend rather
than an out-of-the-box application.

Copied from the PR description: This PR removes:

   - python support (datafusion-python is the main effort; we should focus
   on making it run on Ballista)
   - helm support (this can be a contrib project)
   - a good part of the ci directory (CI works without it)
   - the plugin sub-system (never finalized)
   - caching support (this can be a contrib project)
   - support for HDFS (the library is too old; there are newer libraries to use)
   - the UI (can be a contrib project)
   - made keda, rest-api, and flight-sql optional (we could move them to a
   contrib project)
   - the key-value store is gone; only the memory store remains
   - some dependencies that were detected as unused

Personally, I like this direction, although I have not yet reviewed the PR.

Please leave comments on the PR if you have opinions.

Thanks,

Andy.

[1] https://github.com/apache/datafusion-ballista/pull/1066


Re: [PR] Implement `Spanned` to retrieve source locations on AST nodes [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


Nyrox commented on code in PR #1435:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1435#discussion_r1793097747


##
src/ast/spans.rs:
##
@@ -0,0 +1,1768 @@
+use core::iter;
+
+use crate::tokenizer::Span;
+
+use super::{
+AlterColumnOperation, AlterIndexOperation, AlterTableOperation, Array, 
Assignment,
+AssignmentTarget, CloseCursor, ClusteredIndex, ColumnDef, ColumnOption, 
ColumnOptionDef,
+ConflictTarget, ConstraintCharacteristics, CopySource, CreateIndex, 
CreateTable,
+CreateTableOptions, Cte, Delete, DoUpdate, ExceptSelectItem, 
ExcludeSelectItem, Expr,
+ExprWithAlias, FromTable, Function, FunctionArg, FunctionArgExpr, 
FunctionArgumentClause,
+FunctionArgumentList, FunctionArguments, GroupByExpr, HavingBound, 
IlikeSelectItem, Insert,
+Interpolate, InterpolateExpr, Join, JoinConstraint, JoinOperator, 
JsonPath, JsonPathElem,
+MatchRecognizePattern, Measure, ObjectName, OnConflict, OnConflictAction, 
OnInsert, OrderBy,
+OrderByExpr, Partition, PivotValueSource, ProjectionSelect, Query, 
ReferentialAction,
+RenameSelectItem, ReplaceSelectElement, ReplaceSelectItem, Select, 
SelectItem, SetExpr,
+SqlOption, Statement, Subscript, SymbolDefinition, TableAlias, 
TableConstraint, TableFactor,
+TableOptionsClustered, TableWithJoins, Use, Value, Values, ViewColumnDef,
+WildcardAdditionalOptions, With, WithFill,
+};
+
+/// A trait for AST nodes that have a source span for use in diagnostics.
+///
+/// Source spans are not guaranteed to be entirely accurate. They may
+/// be missing keywords or other tokens. Some nodes may not have a computable
+/// span at all, in which case they return `Span::empty()`.
+///
+/// Some impl blocks may contain doc comments with information
+/// on which nodes are missing spans.
+pub trait Spanned {
+/// Compute the source span for this AST node, by recursively
+/// combining the spans of its children.
+fn span(&self) -> Span;
+}
+
+impl Spanned for Query {
+fn span(&self) -> Span {
+self.body
+.span()
+.union_opt(&self.with.as_ref().map(|i| i.span()))
+}
+}
+
+impl Spanned for With {
+fn span(&self) -> Span {
+union_spans(
+core::iter::once(self.with_token.span)
+.chain(self.cte_tables.iter().map(|item| item.span())),
+)
+}
+}
+
+impl Spanned for Cte {
+fn span(&self) -> Span {
+union_spans(
+core::iter::once(self.alias.span())
+.chain(core::iter::once(self.query.span()))
+.chain(self.from.iter().map(|item| item.span))
+.chain(core::iter::once(self.closing_paren_token.span)),
+)
+}
+}
+
+/// # partial span
+///
+/// [SetExpr::Table] is not implemented.
+impl Spanned for SetExpr {
+fn span(&self) -> Span {
+match self {
+SetExpr::Select(select) => select.span(),
+SetExpr::Query(query) => query.span(),
+SetExpr::SetOperation {
+op: _,
+set_quantifier: _,
+left,
+right,
+} => left.span().union(&right.span()),
+SetExpr::Values(values) => values.span(),
+SetExpr::Insert(statement) => statement.span(),
+SetExpr::Table(_) => Span::empty(),
+SetExpr::Update(statement) => statement.span(),
+}
+}
+}
+
+impl Spanned for Values {
+fn span(&self) -> Span {
+union_spans(
+self.rows
+.iter()
+.map(|row| union_spans(row.iter().map(|expr| expr.span()))),
+)
+}
+}
+
+/// # partial span
+///
+/// Missing spans:
+/// - [Statement::CopyIntoSnowflake]
+/// - [Statement::CreateSecret]
+/// - [Statement::CreateRole]
+/// - [Statement::AlterRole]
+/// - [Statement::AttachDatabase]
+/// - [Statement::AttachDuckDBDatabase]
+/// - [Statement::DetachDuckDBDatabase]
+/// - [Statement::Drop]
+/// - [Statement::DropFunction]
+/// - [Statement::DropProcedure]
+/// - [Statement::DropSecret]
+/// - [Statement::Declare]
+/// - [Statement::CreateExtension]
+/// - [Statement::Fetch]
+/// - [Statement::Flush]
+/// - [Statement::Discard]
+/// - [Statement::SetRole]
+/// - [Statement::SetVariable]
+/// - [Statement::SetTimeZone]
+/// - [Statement::SetNames]
+/// - [Statement::SetNamesDefault]
+/// - [Statement::ShowFunctions]
+/// - [Statement::ShowVariable]
+/// - [Statement::ShowStatus]
+/// - [Statement::ShowVariables]
+/// - [Statement::ShowCreate]
+/// - [Statement::ShowColumns]
+/// - [Statement::ShowTables]
+/// - [Statement::ShowCollation]
+/// - [Statement::StartTransaction]
+/// - [Statement::SetTransaction]
+/// - [Statement::Comment]
+/// - [Statement::Commit]
+/// - [Statement::Rollback]
+/// - [Statement::CreateSchema]
+/// - [Statement::CreateDatabase]
+/// - [Statement::CreateFunction]
+/// - [Statement::CreateTrigger]
+/// - [Statement::DropTrigger]
+/// - [Statement::CreateProcedure]

Re: [PR] Implement `Spanned` to retrieve source locations on AST nodes [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


Nyrox commented on code in PR #1435:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1435#discussion_r1793098167



Re: [PR] Add "DROP TYPE" support. [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


alamb merged PR #1461:
URL: https://github.com/apache/datafusion-sqlparser-rs/pull/1461





Re: [PR] Add "DROP TYPE" support. [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


alamb commented on PR #1461:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1461#issuecomment-2402430677

   🚀 





Re: [PR] chore: Add asf.yaml [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


tisonkun commented on code in PR #1463:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1463#discussion_r1793132420


##
.asf.yaml:
##
@@ -0,0 +1,43 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# This file controls the settings of this repository
+#
+# See more details at
+# https://cwiki.apache.org/confluence/display/INFRA/Git+-+.asf.yaml+features
+
+notifications:
+  commits: comm...@datafusion.apache.org
+  issues: git...@datafusion.apache.org
+  pullrequests: git...@datafusion.apache.org
+  jira_options: link label worklog

Review Comment:
   I don't think this repo is associated with a JIRA project, so we may not need this line.






Re: [PR] Implement `Spanned` to retrieve source locations on AST nodes [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


Nyrox commented on code in PR #1435:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1435#discussion_r1793045556


##
src/ast/mod.rs:
##
@@ -156,6 +179,30 @@ impl Ident {
 Ident {
 value: value.into(),
 quote_style: Some(quote),
+span: Span::empty(),
+}
+}
+
+pub fn with_span<S>(span: Span, value: S) -> Self
+where
+S: Into<String>,
+{
+Ident {
+value: value.into(),
+quote_style: None,
+span,
+}
+}
+
+pub fn with_quote_and_span<S>(quote: char, span: Span, value: S) -> Self
+where
+S: Into<String>,
+{
+assert!(quote == '\'' || quote == '"' || quote == '`' || quote == '[');

Review Comment:
   Removing it sounds good to me. I just wanted to have the same behaviour as `Ident::with_quote`, which still has the check, so it should probably be removed in both places.
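
For reference, a minimal sketch of how the existing constructors and the PR's new `with_span` constructor line up, assuming the PR's final shape; `Span::empty()` stands in for a real tokenizer span.

```rust
use sqlparser::ast::Ident;
use sqlparser::tokenizer::Span;

fn idents() {
    // Existing constructors; after this PR their span defaults to Span::empty().
    let plain = Ident::new("my_column");
    let quoted = Ident::with_quote('"', "my column"); // the quote assert under discussion lives here

    // Constructor added by this PR: attach a source span explicitly.
    // Span::empty() is used because this sketch does not tokenize real input.
    let spanned = Ident::with_span(Span::empty(), "my_column");

    assert_eq!(plain.value, spanned.value);
    let _ = quoted;
}
```
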






Re: [PR] chore: Add asf.yaml [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


Xuanwo commented on code in PR #1463:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1463#discussion_r1793137226


##
.asf.yaml:
##
@@ -0,0 +1,43 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# This file controls the settings of this repository
+#
+# See more details at
+# https://cwiki.apache.org/confluence/display/INFRA/Git+-+.asf.yaml+features
+
+notifications:
+  commits: comm...@datafusion.apache.org
+  issues: git...@datafusion.apache.org
+  pullrequests: git...@datafusion.apache.org
+  jira_options: link label worklog

Review Comment:
   ```suggestion
   ```






Re: [PR] chore: Add asf.yaml [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


Xuanwo commented on code in PR #1463:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1463#discussion_r1793137713


##
.asf.yaml:
##
@@ -0,0 +1,43 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# This file controls the settings of this repository
+#
+# See more details at
+# https://cwiki.apache.org/confluence/display/INFRA/Git+-+.asf.yaml+features
+
+notifications:
+  commits: comm...@datafusion.apache.org
+  issues: git...@datafusion.apache.org
+  pullrequests: git...@datafusion.apache.org
+  jira_options: link label worklog

Review Comment:
   Thanks, removed.






Re: [PR] chore: Add asf.yaml [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


alamb commented on code in PR #1463:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1463#discussion_r1793529354


##
.asf.yaml:
##
@@ -0,0 +1,42 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# This file controls the settings of this repository
+#
+# See more details at
+# https://cwiki.apache.org/confluence/display/INFRA/Git+-+.asf.yaml+features
+
+notifications:
+  commits: comm...@datafusion.apache.org
+  issues: git...@datafusion.apache.org
+  pullrequests: git...@datafusion.apache.org
+github:
+  description: "Extensible SQL Lexer and Parser for Rust"
+  labels:
+    - big-data
+    - rust
+    - sql
+  enabled_merge_buttons:
+    squash: true
+    merge: false
+    rebase: false
+  features:
+    issues: true
+  protected_branches:
+    main:
+      required_pull_request_reviews:
+        required_approving_review_count: 1

Review Comment:
   I would appreciate not putting branch protection on until we have enough 
committers who focus on sqlparser -- I am working on this
   
   ```suggestion
   ```






[PR] chore: Add asf.yaml [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


Xuanwo opened a new pull request, #1463:
URL: https://github.com/apache/datafusion-sqlparser-rs/pull/1463

   This PR adds `asf.yaml`, which is mainly copied from https://github.com/apache/datafusion/blob/main/.asf.yaml.
   
   After this change, we can avoid sending too much traffic to dev@d.a.o.





Re: [PR] chore: Add asf.yaml [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


coveralls commented on PR #1463:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1463#issuecomment-2401717571

   ## Pull Request Test Coverage Report for [Build 11251382349](https://coveralls.io/builds/70242932)
   
   
   ### Details
   
   * **0** of **0** changed or added relevant lines in **0** files are covered.
   * No unchanged relevant lines lost coverage.
   * Overall coverage remained the same at **89.322%**
   
   ---
   
   | Totals | [![Coverage Status](https://coveralls.io/builds/70242932/badge)](https://coveralls.io/builds/70242932) |
   | :-- | --: |
   | Change from base [Build 11239696437](https://coveralls.io/builds/70229031): | 0.0% |
   | Covered Lines: | 30082 |
   | Relevant Lines: | 33678 |
   
   ---
   # 💛  - [Coveralls](https://coveralls.io)
   





Re: [PR] Snowflake: support for extended column options in `CREATE TABLE` [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


7phs commented on code in PR #1454:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1454#discussion_r1793276815


##
src/ast/ddl.rs:
##
@@ -1098,17 +1098,116 @@ impl fmt::Display for ColumnOptionDef {
 }
 }
 
+#[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash)]

Review Comment:
   I added reasonable documentation for the structs, with links to the Snowflake documentation.






Re: [PR] Snowflake: support for extended column options in `CREATE TABLE` [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


7phs commented on code in PR #1454:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1454#discussion_r1793277714


##
src/parser/mod.rs:
##
@@ -6193,24 +6186,160 @@ impl<'a> Parser<'a> {
 && dialect_of!(self is MySqlDialect | SQLiteDialect | 
DuckDbDialect | GenericDialect)
 {
 self.parse_optional_column_option_as()
+} else if self.parse_keyword(Keyword::AUTOINCREMENT)
+&& dialect_of!(self is SnowflakeDialect | SQLiteDialect |  
GenericDialect)
+{
+if dialect_of!(self is SnowflakeDialect) {
+self.prev_token();
+return 
self.parse_snowflake_autoincrement_or_identity_option_column();
+}
+
+// Support AUTOINCREMENT for SQLite
+Ok(Some(ColumnOption::DialectSpecific(vec![
+Token::make_keyword("AUTOINCREMENT"),
+])))
 } else if self.parse_keyword(Keyword::IDENTITY)
-&& dialect_of!(self is MsSqlDialect | GenericDialect)
+&& dialect_of!(self is MsSqlDialect | SnowflakeDialect | 
GenericDialect)
 {
-let property = if self.consume_token(&Token::LParen) {
+if dialect_of!(self is SnowflakeDialect) {
+self.prev_token();
+return 
self.parse_snowflake_autoincrement_or_identity_option_column();
+}
+
+let parameters = if self.consume_token(&Token::LParen) {
 let seed = self.parse_number()?;
 self.expect_token(&Token::Comma)?;
 let increment = self.parse_number()?;
 self.expect_token(&Token::RParen)?;
 
-Some(IdentityProperty { seed, increment })
+Some(IdentityFormat::FunctionCall(IdentityParameters {
+seed,
+increment,
+}))
 } else {
 None
 };
-Ok(Some(ColumnOption::Identity(property)))
+Ok(Some(ColumnOption::Identity(Identity::Identity(
+IdentityProperty {
+parameters,
+order: None,
+},
+))))
+} else if ((self.parse_keyword(Keyword::WITH)
+&& self
+.parse_one_of_keywords(&[Keyword::MASKING, 
Keyword::PROJECTION])
+.is_some())
+|| self
+.parse_one_of_keywords(&[Keyword::MASKING, 
Keyword::PROJECTION])
+.is_some())
+&& dialect_of!(self is SnowflakeDialect | GenericDialect)
+{
+self.prev_token();
+let Some(policy) = self.parse_snowflake_column_policy()? else {
+return Ok(None);
+};
+Ok(Some(ColumnOption::Policy(policy)))
+} else if self.parse_keywords(&[Keyword::TAG])
+&& dialect_of!(self is SnowflakeDialect | GenericDialect)
+{
+self.expect_token(&Token::LParen)?;
+let tags = self.parse_comma_separated(Self::parse_tag)?;
+self.expect_token(&Token::RParen)?;
+
+Ok(Some(ColumnOption::Tags(tags)))
+} else {
+Ok(None)
+}
+}
+
+fn parse_snowflake_autoincrement_or_identity_option_column(

Review Comment:
   I added reasonable documentation for the structs, with links to the Snowflake documentation, and moved this function into the `snowflake` dialect implementation.



##
src/ast/ddl.rs:
##
@@ -1098,17 +1098,116 @@ impl fmt::Display for ColumnOptionDef {
 }
 }
 
+#[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash)]
+#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
+#[cfg_attr(feature = "visitor", derive(Visit, VisitMut))]
+pub enum Identity {
+Autoincrement(IdentityProperty),
+Identity(IdentityProperty),
+}
+
 #[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash)]
 #[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
 #[cfg_attr(feature = "visitor", derive(Visit, VisitMut))]
 pub struct IdentityProperty {
+pub parameters: Option<IdentityFormat>,
+pub order: Option<IdentityOrder>,
+}
+
+#[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash)]
+#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
+#[cfg_attr(feature = "visitor", derive(Visit, VisitMut))]
+pub enum IdentityFormat {
+FunctionCall(IdentityParameters),
+StartAndIncrement(IdentityParameters),
+}
+
+#[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash)]
+#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
+#[cfg_attr(feature = "visitor", derive(Visit, VisitMut))]
+pub struct IdentityParameters {
 pub seed: Expr,
 pub increment: Expr,
 }
 
-impl fmt::Display for IdentityProperty {
+#[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash)]
+#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
+#[cfg_attr(feature = "visitor", derive(Visit, VisitMut))]
+pub enum IdentityOrder {
+Order,
+Noorder,
+}

Re: [PR] Snowflake: support for extended column options in `CREATE TABLE` [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


7phs commented on code in PR #1454:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1454#discussion_r1793281102


##
src/parser/mod.rs:
##
@@ -6193,24 +6186,160 @@ impl<'a> Parser<'a> {
 && dialect_of!(self is MySqlDialect | SQLiteDialect | 
DuckDbDialect | GenericDialect)
 {
 self.parse_optional_column_option_as()
+} else if self.parse_keyword(Keyword::AUTOINCREMENT)
+&& dialect_of!(self is SnowflakeDialect | SQLiteDialect |  
GenericDialect)
+{
+if dialect_of!(self is SnowflakeDialect) {

Review Comment:
   I moved all of the Snowflake-specific column option parsing code into the `snowflake` dialect implementation.
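
For context, a minimal sketch of the syntax this PR targets, borrowing a couple of column definitions from the PR's test cases; it assumes the Snowflake dialect accepts these options once the PR is merged and only leans on the public parse/Display API.

```rust
use sqlparser::dialect::SnowflakeDialect;
use sqlparser::parser::Parser;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Column options borrowed from the PR's tests; AUTOINCREMENT/IDENTITY
    // with ORDER/NOORDER are the Snowflake-specific pieces.
    let sql = "CREATE TABLE my_table (\
               a INT AUTOINCREMENT ORDER, \
               b INT AUTOINCREMENT(100, 1) NOORDER, \
               c INT IDENTITY)";
    let statements = Parser::parse_sql(&SnowflakeDialect {}, sql)?;

    // The test suite verifies statements by round-tripping through Display,
    // so the re-serialized SQL should match the canonical form of the input.
    println!("{}", statements[0]);
    Ok(())
}
```
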






Re: [PR] Snowflake: support for extended column options in `CREATE TABLE` [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


7phs commented on code in PR #1454:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1454#discussion_r1793279817


##
src/ast/ddl.rs:
##
@@ -1291,11 +1406,13 @@ impl fmt::Display for ColumnOption {
 write!(f, "OPTIONS({})", display_comma_separated(options))
 }
 Identity(parameters) => {
-write!(f, "IDENTITY")?;
-if let Some(parameters) = parameters {
-write!(f, "({parameters})")?;
-}
-Ok(())
+write!(f, "{parameters}")
+}
+Policy(parameters) => {
+write!(f, "{parameters}")
+}
+Tags(tags) => {
+write!(f, "WITH TAG ({})", display_comma_separated(tags))

Review Comment:
   The current implementation of similar table options has no such flag, which is why I skipped it.
   
   I added a `with` flag to the existing structs and updated the tags column option definition.






Re: [PR] Snowflake: support for extended column options in `CREATE TABLE` [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


7phs commented on code in PR #1454:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1454#discussion_r1793281102


##
src/parser/mod.rs:
##
@@ -6193,24 +6186,160 @@ impl<'a> Parser<'a> {
 && dialect_of!(self is MySqlDialect | SQLiteDialect | 
DuckDbDialect | GenericDialect)
 {
 self.parse_optional_column_option_as()
+} else if self.parse_keyword(Keyword::AUTOINCREMENT)
+&& dialect_of!(self is SnowflakeDialect | SQLiteDialect |  
GenericDialect)
+{
+if dialect_of!(self is SnowflakeDialect) {

Review Comment:
   I moved all of the Snowflake-specific column option parsing code into the `snowflake` dialect implementation and added a `parse_column_option` method to the `Dialect` trait.






Re: [PR] Snowflake: support for extended column options in `CREATE TABLE` [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


7phs commented on code in PR #1454:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1454#discussion_r1793283035


##
src/parser/mod.rs:
##
@@ -6193,24 +6186,160 @@ impl<'a> Parser<'a> {
 && dialect_of!(self is MySqlDialect | SQLiteDialect | 
DuckDbDialect | GenericDialect)
 {
 self.parse_optional_column_option_as()
+} else if self.parse_keyword(Keyword::AUTOINCREMENT)
+&& dialect_of!(self is SnowflakeDialect | SQLiteDialect |  
GenericDialect)
+{
+if dialect_of!(self is SnowflakeDialect) {
+self.prev_token();
+return 
self.parse_snowflake_autoincrement_or_identity_option_column();
+}
+
+// Support AUTOINCREMENT for SQLite
+Ok(Some(ColumnOption::DialectSpecific(vec![
+Token::make_keyword("AUTOINCREMENT"),
+])))
 } else if self.parse_keyword(Keyword::IDENTITY)
-&& dialect_of!(self is MsSqlDialect | GenericDialect)
+&& dialect_of!(self is MsSqlDialect | SnowflakeDialect | 
GenericDialect)
 {
-let property = if self.consume_token(&Token::LParen) {
+if dialect_of!(self is SnowflakeDialect) {
+self.prev_token();
+return 
self.parse_snowflake_autoincrement_or_identity_option_column();
+}
+
+let parameters = if self.consume_token(&Token::LParen) {
 let seed = self.parse_number()?;
 self.expect_token(&Token::Comma)?;
 let increment = self.parse_number()?;
 self.expect_token(&Token::RParen)?;
 
-Some(IdentityProperty { seed, increment })
+Some(IdentityFormat::FunctionCall(IdentityParameters {
+seed,
+increment,
+}))
 } else {
 None
 };
-Ok(Some(ColumnOption::Identity(property)))
+Ok(Some(ColumnOption::Identity(Identity::Identity(
+IdentityProperty {
+parameters,
+order: None,
+},
+))))
+} else if ((self.parse_keyword(Keyword::WITH)
+&& self
+.parse_one_of_keywords(&[Keyword::MASKING, 
Keyword::PROJECTION])
+.is_some())
+|| self
+.parse_one_of_keywords(&[Keyword::MASKING, 
Keyword::PROJECTION])
+.is_some())

Review Comment:
   I used this suggestion for the `with` parsing in both the common column option parsing function and the Snowflake-specific one.






Re: [PR] Snowflake: support for extended column options in `CREATE TABLE` [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


7phs commented on code in PR #1454:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1454#discussion_r1793354187


##
tests/sqlparser_snowflake.rs:
##
@@ -525,6 +525,307 @@ fn test_snowflake_single_line_tokenize() {
 assert_eq!(expected, tokens);
 }
 
+#[test]
+fn test_snowflake_create_table_with_autoincrement_columns() {
+let sql = concat!(
+"CREATE TABLE my_table (",
+"a INT AUTOINCREMENT ORDER, ",
+"b INT AUTOINCREMENT(100, 1) NOORDER, ",
+"c INT IDENTITY, ",
+"d INT IDENTITY START 100 INCREMENT 1 ORDER",
+")"
+);
+// these are Snowflake-specific options (AUTOINCREMENT/IDENTITY)
+match snowflake().verified_stmt(sql) {
+Statement::CreateTable(CreateTable { columns, .. }) => {
+assert_eq!(
+columns,
+vec![
+ColumnDef {
+name: "a".into(),
+data_type: DataType::Int(None),
+collation: None,
+options: vec![ColumnOptionDef {
+name: None,
+option: 
ColumnOption::Identity(Identity::Autoincrement(
+IdentityProperty {
+parameters: None,
+order: Some(IdentityOrder::Order),
+}
+))
+}]
+},
+ColumnDef {
+name: "b".into(),
+data_type: DataType::Int(None),
+collation: None,
+options: vec![ColumnOptionDef {
+name: None,
+option: 
ColumnOption::Identity(Identity::Autoincrement(
+IdentityProperty {
+#[cfg(not(feature = "bigdecimal"))]
+parameters: 
Some(IdentityFormat::FunctionCall(
+IdentityParameters {
+seed: Expr::Value(Value::Number(
+"100".to_string(),
+false
+)),
+increment: 
Expr::Value(Value::Number(
+"1".to_string(),
+false
+)),
+}
+)),
+#[cfg(feature = "bigdecimal")]
+parameters: 
Some(IdentityFormat::FunctionCall(
+IdentityParameters {
+seed: Expr::Value(Value::Number(
+
bigdecimal::BigDecimal::from(100),
+false,
+)),
+increment: 
Expr::Value(Value::Number(
+
bigdecimal::BigDecimal::from(1),
+false,
+)),
+}
+)),
+order: Some(IdentityOrder::Noorder),
+}
+))
+}]
+},
+ColumnDef {
+name: "c".into(),
+data_type: DataType::Int(None),
+collation: None,
+options: vec![ColumnOptionDef {
+name: None,
+option: 
ColumnOption::Identity(Identity::Identity(IdentityProperty {
+parameters: None,
+order: None,
+}))
+}]
+},
+ColumnDef {
+name: "d".into(),
+data_type: DataType::Int(None),
+collation: None,
+options: vec![ColumnOptionDef {
+name: None,
+option: 
ColumnOption::Identity(Identity::Identity(IdentityProperty {
+#[cfg(not(feature = "bigdecimal"))]
+parameters: 
Some(IdentityFormat::StartAndIncrement(
+IdentityParameters {
+seed: 
Expr::Value(Value::Number("100".to_stri

Re: [PR] Snowflake: support for extended column options in `CREATE TABLE` [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


7phs commented on code in PR #1454:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1454#discussion_r1793352226


##
tests/sqlparser_snowflake.rs:
##
@@ -525,6 +525,307 @@ fn test_snowflake_single_line_tokenize() {
 assert_eq!(expected, tokens);
 }
 
+#[test]
+fn test_snowflake_create_table_with_autoincrement_columns() {
+let sql = concat!(
+"CREATE TABLE my_table (",
+"a INT AUTOINCREMENT ORDER, ",
+"b INT AUTOINCREMENT(100, 1) NOORDER, ",
+"c INT IDENTITY, ",
+"d INT IDENTITY START 100 INCREMENT 1 ORDER",
+")"
+);
+// these are Snowflake-specific options (AUTOINCREMENT/IDENTITY)
+match snowflake().verified_stmt(sql) {
+Statement::CreateTable(CreateTable { columns, .. }) => {
+assert_eq!(
+columns,
+vec![
+ColumnDef {
+name: "a".into(),
+data_type: DataType::Int(None),
+collation: None,
+options: vec![ColumnOptionDef {
+name: None,
+option: 
ColumnOption::Identity(Identity::Autoincrement(
+IdentityProperty {
+parameters: None,
+order: Some(IdentityOrder::Order),
+}
+))
+}]
+},
+ColumnDef {
+name: "b".into(),
+data_type: DataType::Int(None),
+collation: None,
+options: vec![ColumnOptionDef {
+name: None,
+option: 
ColumnOption::Identity(Identity::Autoincrement(
+IdentityProperty {
+#[cfg(not(feature = "bigdecimal"))]
+parameters: 
Some(IdentityFormat::FunctionCall(
+IdentityParameters {
+seed: Expr::Value(Value::Number(
+"100".to_string(),
+false
+)),
+increment: 
Expr::Value(Value::Number(
+"1".to_string(),
+false
+)),
+}
+)),
+#[cfg(feature = "bigdecimal")]
+parameters: 
Some(IdentityFormat::FunctionCall(
+IdentityParameters {
+seed: Expr::Value(Value::Number(
+
bigdecimal::BigDecimal::from(100),
+false,
+)),
+increment: 
Expr::Value(Value::Number(
+
bigdecimal::BigDecimal::from(1),
+false,
+)),
+}
+)),
+order: Some(IdentityOrder::Noorder),
+}
+))
+}]
+},
+ColumnDef {
+name: "c".into(),
+data_type: DataType::Int(None),
+collation: None,
+options: vec![ColumnOptionDef {
+name: None,
+option: 
ColumnOption::Identity(Identity::Identity(IdentityProperty {
+parameters: None,
+order: None,
+}))
+}]
+},
+ColumnDef {
+name: "d".into(),
+data_type: DataType::Int(None),
+collation: None,
+options: vec![ColumnOptionDef {
+name: None,
+option: 
ColumnOption::Identity(Identity::Identity(IdentityProperty {
+#[cfg(not(feature = "bigdecimal"))]

Review Comment:
   Applied to the tests for `snowflake` and `ms sql`.




[PR] fix for maybe_parse preventing parser from erroring on recursion limit [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


tomershaniii opened a new pull request, #1464:
URL: https://github.com/apache/datafusion-sqlparser-rs/pull/1464

   (no comment)
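
For context, the recursion limit that `maybe_parse` interacts with can be exercised through the existing public API; a rough sketch follows (the SQL and limit below are arbitrary, and the internal `maybe_parse` change itself is not shown).

```rust
use sqlparser::dialect::GenericDialect;
use sqlparser::parser::Parser;

fn main() {
    // Deeply nested parentheses push the parser past its recursion limit; the
    // expected behaviour is an Err (RecursionLimitExceeded) rather than a
    // stack overflow or a silently swallowed error.
    let sql = format!("SELECT {}1{}", "(".repeat(200), ")".repeat(200));
    let result = Parser::new(&GenericDialect {})
        .with_recursion_limit(50)
        .try_with_sql(&sql)
        .and_then(|mut parser| parser.parse_statements());
    assert!(result.is_err());
}
```
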





Re: [PR] Implement `Spanned` to retrieve source locations on AST nodes [datafusion-sqlparser-rs]

2024-10-09 Thread via GitHub


Nyrox commented on code in PR #1435:
URL: 
https://github.com/apache/datafusion-sqlparser-rs/pull/1435#discussion_r1793057894

