HeartSaVioR commented on code in PR #49277:
URL: https://github.com/apache/spark/pull/49277#discussion_r1915917226


##########
sql/core/src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala:
##########
@@ -374,67 +374,29 @@ object SchemaConverters extends Logging {
     }
   }
 
-  private def getDefaultValue(dataType: DataType): Any = {
-    def createNestedDefault(st: StructType): java.util.HashMap[String, Any] = {
-      val defaultMap = new java.util.HashMap[String, Any]()
-      st.fields.foreach { field =>
-        field.dataType match {
-          case nested: StructType =>
-            // For nested structs, recursively create the default structure
-            defaultMap.put(field.name, createNestedDefault(nested))
-          case _ =>
-            // For leaf fields, use null
-            defaultMap.put(field.name, null)
-        }
+  private def createDefaultStruct(st: StructType): java.util.HashMap[String, Any] = {

Review Comment:
   Please correct me if I'm missing something. Except for the top-level StructType (which is not a nested one), I wonder why this still needs to create all nested fields with a default of null.
   
   Let's say, something like:
   
   ```
   - record1
     - field1
     - field2
   - record2
     - field1
     - field2
   ```
   
   Would Avro disallow simply setting record1 and record2 themselves to null? That is possible with a Spark SQL Row, so I wonder whether this is needed to fit Avro's semantics, or whether it is something we can simplify further.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

