andygrove commented on code in PR #1376:
URL: https://github.com/apache/datafusion-comet/pull/1376#discussion_r1945884758
##########
spark/src/test/scala/org/apache/comet/CometExpressionSuite.scala:
##########
@@ -125,6 +125,26 @@ class CometExpressionSuite extends CometTestBase with AdaptiveSparkPlanHelper {
}
}
+ test("uint data type support") {
+ Seq(true, false).foreach { dictionaryEnabled =>
+ Seq(Byte.MaxValue, Short.MaxValue).foreach { valueRanges =>
+ {
+ withTempDir { dir =>
+ val path = new Path(dir.toURI.toString, "testuint.parquet")
+ makeParquetFileAllTypes(path, dictionaryEnabled = dictionaryEnabled, valueRanges + 1)
+ withParquetTable(path.toString, "tbl") {
+ if (CometSparkSessionExtensions.isComplexTypeReaderEnabled(conf)) {
+ checkSparkAnswer("select _9, _10 FROM tbl order by _11")
Review Comment:
Do we already have logic to fall back to Spark when the complex type reader
is enabled and the query references uint Parquet fields?
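
For context, the kind of check being asked about might look roughly like the
sketch below. All names here (`UnsignedIntFallback`, `shouldFallBackToSpark`,
the `"isUnsigned"` metadata key) are hypothetical illustrations, not the
actual Comet API:

```scala
// Hypothetical sketch only: the object, method, and metadata key below are
// illustrative and do not reflect the real Comet implementation.
import org.apache.spark.sql.types.StructField

object UnsignedIntFallback {
  // Parquet's unsigned logical types (UINT_8/16/32/64) have no exact Spark
  // analogue, so a scan touching such a column could be routed back to
  // Spark's native Parquet reader instead of Comet's.
  def shouldFallBackToSpark(
      complexTypeReaderEnabled: Boolean,
      referencedFields: Seq[StructField]): Boolean = {
    complexTypeReaderEnabled &&
      referencedFields.exists(f => f.metadata.contains("isUnsigned"))
  }
}
```

If no such check exists yet, the test added in this PR would presumably
exercise whatever path currently handles these columns, which is what the
question above is probing.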
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]