jackylee-ch opened a new pull request, #50754: URL: https://github.com/apache/spark/pull/50754
### What changes were proposed in this pull request?

In [SPARK-20211](https://issues.apache.org/jira/browse/SPARK-20211), we fixed BigDecimal type conversion exceptions. However, the `CatalystTypeConverters.convertToCatalyst` method was not updated accordingly, so users still hit exceptions when converting `BigDecimal` values. Below is a minimal reproduction of the problem:

```scala
CatalystTypeConverters.convertToCatalyst(BigDecimal("0.01"))
```

```
Decimal scale (2) cannot be greater than precision (1).
org.apache.spark.sql.AnalysisException: Decimal scale (2) cannot be greater than precision (1).
	at org.apache.spark.sql.errors.DataTypeErrors$.decimalCannotGreaterThanPrecisionError(DataTypeErrors.scala:122)
	at org.apache.spark.sql.types.DecimalType.<init>(DecimalType.scala:46)
	at org.apache.spark.sql.catalyst.CatalystTypeConverters$.convertToCatalyst(CatalystTypeConverters.scala:578)
	at org.apache.spark.sql.catalyst.CatalystTypeConvertersSuite.$anonfun$new$18(CatalystTypeConvertersSuite.scala:159)
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Newly added test.

### Was this patch authored or co-authored using generative AI tooling?

No.

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
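For context on why `BigDecimal("0.01")` trips the check: `java.math.BigDecimal` counts precision as the number of significant digits of the unscaled value, so `0.01` reports precision 1 but scale 2, and constructing a `DecimalType` with precision < scale fails Spark's validity check. The standalone sketch below demonstrates this and the general shape of the fix (widening precision to at least the scale, as the existing `DecimalType` conversion path does elsewhere); it does not depend on Spark, and the `adjustedPrecision` name is illustrative, not taken from the patch.

```scala
object DecimalPrecisionDemo {
  def main(args: Array[String]): Unit = {
    val d = BigDecimal("0.01")
    // Unscaled value is 1, so precision = 1 (one significant digit) while scale = 2.
    // DecimalType(precision = 1, scale = 2) therefore fails the precision >= scale check.
    println(s"precision = ${d.precision}, scale = ${d.scale}")

    // Hedged sketch of the fix: widen precision to cover the scale before
    // building the Catalyst DecimalType. (Name is hypothetical.)
    val adjustedPrecision = math.max(d.precision, d.scale)
    println(s"adjusted precision = $adjustedPrecision, scale = ${d.scale}")
  }
}
```

Running this prints `precision = 1, scale = 2`, matching the numbers in the exception message above.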