LuciferYang commented on code in PR #51002:
URL: https://github.com/apache/spark/pull/51002#discussion_r2122632392


##########
sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala:
##########
@@ -645,8 +645,21 @@ class DataSourceV2SQLSuiteV1Filter
       assert(replaced.columns.length === 1,
         "Replaced table should have new schema.")
       val actual = replaced.columns.head
-      val expected = ColumnV2.create("id", LongType, false, null,
-        new ColumnDefaultValue("41 + 1", LiteralValue(42L, LongType)), null)
+      val expected = ColumnV2.create(

Review Comment:
   This test will fail in non-ANSI mode:
   
   - https://github.com/apache/spark/actions/runs/15406491464/job/43350133331
   
   
   ![image](https://github.com/user-attachments/assets/627e3cdc-711b-4264-a4fb-ad6bf6257011)
   
   
   We can reproduce the issue locally with the following command:
   
   ```
   SPARK_ANSI_SQL_MODE=false build/sbt "sql/testOnly org.apache.spark.sql.connector.DataSourceV2SQLSuiteV1Filter"
   ```
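   
   (For reference, the same mode can also be pinned inside a single test instead of flipping the environment variable for the whole sbt run; this is only a minimal sketch, assuming the suite mixes in `SQLTestUtils` / `SQLConfHelper`, which provide `withSQLConf`:)
   
   ```scala
   // Sketch only: force non-ANSI mode for one block rather than setting
   // SPARK_ANSI_SQL_MODE for the entire test run.
   import org.apache.spark.sql.internal.SQLConf
   
   withSQLConf(SQLConf.ANSI_ENABLED.key -> "false") {
     // run the REPLACE TABLE statement and the ColumnV2 assertions here
   }
   ```
   
   The failure in non-ANSI mode: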
   
   
   ```
   [info] - ReplaceTable: Erases the table contents and changes the metadata *** FAILED *** (30 milliseconds)
   [info]   ColumnImpl("id", LongType, false, null, ColumnDefaultValue{sql=41 + 1, expression=null, value=42}, null, null, null) did not equal ColumnImpl("id", LongType, false, null, ColumnDefaultValue{sql=41 + 1, expression=CAST(41 + 1 AS long), value=42}, null, null, null) Replaced table should have new schema with DEFAULT column metadata. (DataSourceV2SQLSuite.scala:663)
   [info]   Analysis:
   [info]   ColumnImpl(defaultValue: ColumnDefaultValue{sql=41 + 1, expression=null, value=42} -> ColumnDefaultValue{sql=41 + 1, expression=CAST(41 + 1 AS long), value=42})
   [info]   org.scalatest.exceptions.TestFailedException:
   [info]   at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
   [info]   at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
   [info]   at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
   [info]   at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
   [info]   at org.apache.spark.sql.connector.DataSourceV2SQLSuiteV1Filter.$anonfun$new$65(DataSourceV2SQLSuite.scala:663)
   [info]   at org.apache.spark.sql.catalyst.SQLConfHelper.withSQLConf(SQLConfHelper.scala:56)
   [info]   at org.apache.spark.sql.catalyst.SQLConfHelper.withSQLConf$(SQLConfHelper.scala:38)
   [info]   at org.apache.spark.sql.connector.InsertIntoTests.org$apache$spark$sql$test$SQLTestUtilsBase$$super$withSQLConf(InsertIntoTests.scala:42)
   [info]   at org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf(SQLTestUtils.scala:253)
   [info]   at org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf$(SQLTestUtils.scala:251)
   [info]   at org.apache.spark.sql.connector.InsertIntoTests.withSQLConf(InsertIntoTests.scala:42)
   [info]   at org.apache.spark.sql.connector.DataSourceV2SQLSuiteV1Filter.$anonfun$new$64(DataSourceV2SQLSuite.scala:639)
   ```
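   
   One possible way to make the expectation hold in both modes (just a sketch, not necessarily the right fix for this PR) is to assert only on the stable parts of the default value, since the resolved connector expression (`null` vs. `CAST(41 + 1 AS long)`) depends on the ANSI setting:
   
   ```scala
   // Sketch only: compare the SQL text and the evaluated literal of the DEFAULT
   // value rather than the whole ColumnV2, whose connector-level expression
   // differs between ANSI and non-ANSI analysis.
   val default = replaced.columns.head.defaultValue()
   assert(default.getSql === "41 + 1")
   assert(default.getValue === LiteralValue(42L, LongType))
   ```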
   
   
   Do you have time to take a look, @aokolnychyi?
   Also cc @cloud-fan.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

