cloud-fan commented on code in PR #50959:
URL: https://github.com/apache/spark/pull/50959#discussion_r2113185354


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##########
@@ -432,18 +432,34 @@ object ResolveDefaultColumns extends QueryErrorsBase
       targetType: DataType,
       colName: String): Option[Expression] = {
     expr match {
-      case l: Literal if !Seq(targetType, l.dataType).exists(_ match {
+      case l: Literal => defaultValueFromWiderType(l, targetType, colName)
+      case _ => None
+    }
+  }
+
+  /**
+   * If the provided default value is a literal of a wider type than the target column,
+   * but the literal value fits within the narrower type, just coerce it for convenience.
+   * Exclude boolean/array/struct/map types from consideration for this type coercion to
+   * avoid surprising behavior like interpreting "false" as integer zero.
+   */
+  def defaultValueFromWiderType(
+      expr: Expression,
+      targetType: DataType,
+      colName: String): Option[Expression] = {
+    expr match {
+      case e if !Seq(targetType, e.dataType).exists(_ match {
         case _: BooleanType | _: ArrayType | _: StructType | _: MapType => true
         case _ => false
       }) =>
-        val casted = Cast(l, targetType, Some(conf.sessionLocalTimeZone), evalMode = EvalMode.TRY)
+        val casted = Cast(e, targetType, Some(conf.sessionLocalTimeZone), evalMode = EvalMode.TRY)

Review Comment:
   This means that the input `expr` must be foldable, but we didn't check it.
   
   I don't think we need two methods. We can just extend the existing `defaultValueFromWiderTypeLiteral`. But instead of matching `Literal`, we match foldable expressions.
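   A rough sketch of that suggestion (not the actual patch): keep a single method and guard on `e.foldable` instead of matching only `Literal`. The post-cast handling here is illustrative, and `conf` is assumed to come from the enclosing `ResolveDefaultColumns` object.

   ```scala
   import org.apache.spark.sql.catalyst.expressions.{Cast, EvalMode, Expression, Literal}
   import org.apache.spark.sql.types.{ArrayType, BooleanType, DataType, MapType, StructType}

   // Single method: accept any foldable expression instead of only Literal.
   // colName would be used for error reporting in the real method.
   def defaultValueFromWiderTypeLiteral(
       expr: Expression,
       targetType: DataType,
       colName: String): Option[Expression] = expr match {
     // Require foldability so the TRY cast below can be evaluated eagerly.
     case e if e.foldable && !Seq(targetType, e.dataType).exists {
         case _: BooleanType | _: ArrayType | _: StructType | _: MapType => true
         case _ => false
       } =>
       val casted = Cast(e, targetType, Some(conf.sessionLocalTimeZone), evalMode = EvalMode.TRY)
       // Illustrative only: fold the cast; a null result means the value does not
       // fit the narrower target type, so give up rather than silently truncating.
       Option(casted.eval()).map(Literal(_, targetType))
     case _ => None
   }
   ```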



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

