aokolnychyi commented on code in PR #50792:
URL: https://github.com/apache/spark/pull/50792#discussion_r2082562046


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/connector/catalog/CatalogV2Util.scala:
##########
@@ -598,10 +598,28 @@ private[sql] object CatalogV2Util {
       //       data unchanged and let the data reader to return "exist default" for missing
       //       columns.
       val existingDefault = Literal(default.getValue.value(), default.getValue.dataType()).sql
-      f.withExistenceDefaultValue(existingDefault).withCurrentDefaultValue(default.getSql)
+      f.withExistenceDefaultValue(existingDefault).withCurrentDefaultValue(toSql(defaultValue))
     }.getOrElse(f)
   }
 
+  private def toSql(defaultValue: DefaultValue): String = {
+    if (defaultValue.getExpression != null) {

Review Comment:
   @cloud-fan, that could be an option. However, it doesn't produce the Spark SQL dialect for expressions like DAY_OF_WEEK and similar; it seems to generate a user-friendly representation instead.
   
   The fundamental problem is that we have a Catalyst expression and we need to persist it in the struct field metadata. I wonder if we can avoid the need to put the expression in the metadata at all. That is going to be challenging for sure, as it would require changing `StructField`, which is a very stable API.
   
   Any thoughts?
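   For illustration, the fallback logic being discussed (prefer an explicit SQL string when the connector supplied one, otherwise derive SQL from the expression) could be sketched roughly like this. Note this is a hypothetical, self-contained sketch: `DefaultValueSketch` and its fields are invented for the example and are not the actual Spark `DefaultValue` API.
   
   ```scala
   // Hypothetical stand-in for the connector DefaultValue: it may carry an
   // explicit SQL string, an expression-derived SQL rendering, or both.
   final case class DefaultValueSketch(sql: Option[String], exprSql: Option[String])
   
   // Prefer the user-provided SQL; fall back to the expression rendering;
   // fail loudly if neither is available, since the metadata needs *some* SQL.
   def toSqlSketch(dv: DefaultValueSketch): String =
     dv.sql
       .orElse(dv.exprSql)
       .getOrElse(throw new IllegalArgumentException(
         "default value has neither a SQL string nor an expression"))
   
   // Example: an explicit SQL string wins over the expression-derived form.
   println(toSqlSketch(DefaultValueSketch(Some("CURRENT_DATE"), Some("current_date()"))))
   ```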



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

