HeartSaVioR commented on code in PR #49277:
URL: https://github.com/apache/spark/pull/49277#discussion_r1916138007


##########
sql/core/src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala:
##########
@@ -374,67 +374,29 @@ object SchemaConverters extends Logging {
     }
   }
 
-  private def getDefaultValue(dataType: DataType): Any = {
-    def createNestedDefault(st: StructType): java.util.HashMap[String, Any] = {
-      val defaultMap = new java.util.HashMap[String, Any]()
-      st.fields.foreach { field =>
-        field.dataType match {
-          case nested: StructType =>
-            // For nested structs, recursively create the default structure
-            defaultMap.put(field.name, createNestedDefault(nested))
-          case _ =>
-            // For leaf fields, use null
-            defaultMap.put(field.name, null)
-        }
+  private def createDefaultStruct(st: StructType): java.util.HashMap[String, Any] = {

Review Comment:
   I guess we have resolved this offline - having the value as null for a struct type does not incur any problem, even though that struct type has non-nullable nested columns. This is handled in Spark.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
