cloud-fan commented on code in PR #50913:
URL: https://github.com/apache/spark/pull/50913#discussion_r2095603964
##########
sql/core/src/test/scala/org/apache/spark/sql/CsvFunctionsSuite.scala:
##########

@@ -824,8 +824,8 @@ class CsvFunctionsSuite extends QueryTest with SharedSparkSession {
       parameters = Map("schema" -> "\"STRUCT<a: VARIANT, b: VARIANT>\""))

     // In singleVariantColumn mode, from_csv normally treats all inputs as valid. The only exception
-    // case is the input exceeds the variant size limit (16MiB).
-    val largeInput = "a" * (16 * 1024 * 1024)
+    // case is the input exceeds the variant size limit (128MiB).
+    val largeInput = "a" * (128 * 1024 * 1024)

Review Comment:
Maybe the GitHub Actions (GA) machines are not powerful enough to run tests that allocate 128 MB inputs. One idea is to keep a hardcoded 16 MB limit for testing, e.g.
```
// 16 MiB (U24_MAX + 1) under SPARK_TESTING, 128 MiB otherwise.
public static final int SIZE_LIMIT =
    System.getenv("SPARK_TESTING") != null ? U24_MAX + 1 : 128 * 1024 * 1024;
```
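
A minimal sketch of how the CSV test could pick up whatever limit is in effect instead of hardcoding 128 MiB, assuming the constant lives at `org.apache.spark.types.variant.VariantUtil.SIZE_LIMIT`; the exact name and location are assumptions for illustration, not something stated in this thread:

```scala
// Sketch only, not part of the PR: size the oversized input from the variant
// limit rather than hardcoding 128 * 1024 * 1024, so the test stays cheap if a
// SPARK_TESTING override shrinks the limit back to 16 MiB.
import org.apache.spark.types.variant.VariantUtil // assumed location of SIZE_LIMIT

object LargeInputSketch {
  // Mirrors the existing test's pattern: an input as large as the limit itself,
  // whose encoded variant is expected to exceed it.
  val largeInput: String = "a" * VariantUtil.SIZE_LIMIT
}
```

Deriving the size from the constant would also keep the test and the limit from drifting apart if the limit changes again.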