parthchandra commented on code in PR #1398:
URL: https://github.com/apache/datafusion-comet/pull/1398#discussion_r1956903894


##########
spark/src/main/scala/org/apache/comet/DataTypeSupport.scala:
##########
@@ -37,7 +37,7 @@ trait DataTypeSupport {
 
   private def isGloballySupported(dt: DataType): Boolean = dt match {
     case ByteType | ShortType
-        if CometSparkSessionExtensions.isComplexTypeReaderEnabled(SQLConf.get) &&
+        if CometSparkSessionExtensions.usingDataFusionParquetExec(SQLConf.get) &&

Review Comment:
   Short answer: no, I don't think that is possible here.
   The DataType here is a Spark type, which only has ByteType and ShortType. Both are signed types (Spark has no unsigned types). Spark supports reading Parquet unsigned types into Spark signed types, so both uint_8/uint_16 and int_8/int_16 get read into ByteType/ShortType.
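   To illustrate, the match that the diff touches can be read as a minimal sketch like the one below. Only the `ByteType | ShortType` case and the `usingDataFusionParquetExec` guard come from the diff; the rest of the method body, and the condition that follows the trailing `&&` in the original, are assumptions or elided:

   ```scala
   // Hypothetical sketch of DataTypeSupport.isGloballySupported, reconstructed
   // from the diff fragment above; other cases are elided.
   private def isGloballySupported(dt: DataType): Boolean = dt match {
     // ByteType/ShortType are the only Spark 8/16-bit integer types, and they
     // are signed. Parquet int_8/int_16 AND uint_8/uint_16 columns are all
     // read into these same two Spark types, so no unsigned case can appear.
     case ByteType | ShortType
         if CometSparkSessionExtensions.usingDataFusionParquetExec(SQLConf.get) =>
       // The original guard continues with "&& ..." (condition not shown in
       // the diff), so this branch is incomplete here.
       true
     case _ => false // remaining cases elided
   }
   ```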



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

