huaxingao commented on PR #50275: URL: https://github.com/apache/spark/pull/50275#issuecomment-2727092581
@pan3793 When setting an [option](https://github.com/apache/spark/blob/master/sql/api/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala#L87), Spark does not convert the key to lower case. However, when creating a [DataSourceV2Relation](https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/classic/DataFrameWriter.scala#L186), the keys in dsOptions are converted to lower case. I see two ways to fix the problem: the first is what I did in this PR; the second is to convert the key to lower case in [DataFrameWriter.option](https://github.com/apache/spark/blob/master/sql/api/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala#L87) itself.

The second way doesn't work, because the Iceberg test

```java
df2.select("id", "data")
    .sort("data")
    .write()
    .format("org.apache.iceberg.spark.source.ManualSource")
    .option(ManualSource.TABLE_NAME, manualTableName)
```

uses [ManualSource.TABLE_NAME](https://github.com/apache/iceberg/blob/main/spark/v3.4/spark/src/test/java/org/apache/iceberg/spark/source/ManualSource.java#L33), which is upper case. If `.option(ManualSource.TABLE_NAME, manualTableName)` changed the key to lower case, I would get

```
Missing property TABLE_NAME
java.lang.IllegalArgumentException: Missing property TABLE_NAME
	at org.apache.iceberg.relocated.com.google.common.base.Preconditions.checkArgument(Preconditions.java:143)
	at org.apache.iceberg.spark.source.ManualSource.getTable(ManualSource.java:64)
	at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Utils$.getTableFromProvider(DataSourceV2Utils.scala:98)
```
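For illustration only, here is a minimal, self-contained sketch (not the change in this PR) of the behavior difference: eagerly lower-casing keys loses the spelling that a case-sensitive provider like `ManualSource` looks up, whereas `CaseInsensitiveStringMap` answers lookups case-insensitively while still exposing the original keys via `asCaseSensitiveMap()`. The option key and value below are made up for the example.

```scala
import scala.jdk.CollectionConverters._
import org.apache.spark.sql.util.CaseInsensitiveStringMap

object OptionKeyCaseSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical option, written the way the Iceberg test writes it: upper case.
    val userOptions = Map("TABLE_NAME" -> "manual_table")

    // If the writer lower-cased keys eagerly, the original spelling would be gone,
    // and a case-sensitive lookup like the checkArgument in ManualSource.getTable
    // would fail with "Missing property TABLE_NAME".
    val lowerCased = userOptions.map { case (k, v) => (k.toLowerCase, v) }
    assert(!lowerCased.contains("TABLE_NAME"))

    // CaseInsensitiveStringMap keeps both behaviors: lookups ignore case,
    // and asCaseSensitiveMap() still returns the keys as the caller wrote them.
    val ciOptions = new CaseInsensitiveStringMap(userOptions.asJava)
    assert(ciOptions.get("table_name") == "manual_table")
    assert(ciOptions.asCaseSensitiveMap().containsKey("TABLE_NAME"))
  }
}
```

This is only meant to show why preserving the user-supplied casing in the options handed to the provider, which is the direction the first fix takes, keeps the Iceberg test working.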