kazuyukitanimura commented on code in PR #1529:
URL: https://github.com/apache/datafusion-comet/pull/1529#discussion_r1995947311


##########
spark/src/test/scala/org/apache/comet/CometExpressionSuite.scala:
##########
@@ -2716,17 +2705,15 @@ class CometExpressionSuite extends CometTestBase with AdaptiveSparkPlanHelper {
                   | from tbl1 t1 join tbl2 t2 on t1._id = t2._id
                   | order by t1._id""".stripMargin)
 
-              if (isSpark34Plus) {
-                // decimal support requires Spark 3.4 or later
-                checkSparkAnswerAndOperator("""
-                    |select
-                    | t1._12 div t2._12, div(t1._12, t2._12),
-                    | t1._15 div t2._15, div(t1._15, t2._15),
-                    | t1._16 div t2._16, div(t1._16, t2._16),
-                    | t1._17 div t2._17, div(t1._17, t2._17)
-                    | from tbl1 t1 join tbl2 t2 on t1._id = t2._id
-                    | order by t1._id""".stripMargin)
-              }
+              // decimal support requires Spark 3.4 or later

Review Comment:
   nit: we can remove this comment, since the Spark 3.4 version check it refers to is gone.



##########
spark/src/test/spark-3.4-plus/org/apache/comet/exec/CometExec3_4PlusSuite.scala:
##########
@@ -19,18 +19,37 @@
 
 package org.apache.comet.exec
 
+import java.io.ByteArrayOutputStream
+import scala.util.Random
+import org.apache.spark.sql.{Column, CometTestBase}
+import org.apache.comet.CometConf
+import org.apache.spark.sql.catalyst.FunctionIdentifier
+import org.apache.spark.sql.catalyst.expressions.{BloomFilterMightContain, Expression, ExpressionInfo}
+import org.apache.spark.sql.functions.{col, lit}
+import org.apache.spark.util.sketch.BloomFilter
 import org.scalactic.source.Position
 import org.scalatest.Tag
-
-import org.apache.spark.sql.CometTestBase
-import org.apache.comet.CometConf
-
 /**
  * This test suite contains tests for only Spark 3.4+.
  */
 class CometExec3_4PlusSuite extends CometTestBase {

Review Comment:
   (optional) Since Spark 3.4 is now the lowest supported version, we may be able to move this suite into the regular test location.



##########
spark/src/main/spark-3.4/org/apache/comet/shims/CometExprShim.scala:
##########
@@ -33,11 +32,6 @@ trait CometExprShim {
         (unhex.child, Literal(unhex.failOnError))
     }
 
-    protected def isTimestampNTZType(dt: DataType): Boolean = dt match {
-        case _: TimestampNTZType => true
-        case _ => false
-    }
-
     protected def evalMode(c: Cast): CometEvalMode.Value =

Review Comment:
   (Optional) `evalMode` can now be a common method instead of a per-version shim override.
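   As a rough sketch of the idea (illustrative names only, not the actual Comet shim code): once every supported Spark version exposes `Cast.evalMode`, the mapping from Spark's eval mode to Comet's can live in one common method with no version check. The enums below are stand-ins for Spark's `EvalMode` and Comet's `CometEvalMode`.

   ```scala
   // Hypothetical sketch: with Spark 3.4 as the minimum supported version,
   // the per-version shim override is unnecessary and the eval-mode mapping
   // can be a single shared method. Enums are stand-ins, not the real types.
   object EvalModeSketch {
     object SparkEvalMode extends Enumeration { val LEGACY, ANSI, TRY = Value }
     object CometEvalMode extends Enumeration { val LEGACY, ANSI, TRY = Value }

     // The common method: a direct name-based mapping, no Spark-version branch.
     def evalMode(mode: SparkEvalMode.Value): CometEvalMode.Value =
       CometEvalMode.withName(mode.toString)

     def main(args: Array[String]): Unit = {
       println(evalMode(SparkEvalMode.ANSI)) // prints ANSI
     }
   }
   ```

   The real change would presumably just move the existing `evalMode(c: Cast)` body out of `CometExprShim` into the common trait.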



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

