huaxingao commented on PR #1920:
URL: https://github.com/apache/datafusion-comet/pull/1920#issuecomment-2997669574

   I'm hitting some strange errors:
   ```
   [info] ParquetV1QuerySuite:
   [info] - simple select queries (635 milliseconds)
   [info] - appending (254 milliseconds)
   [info] - overwriting (562 milliseconds)
   [info] - self-join (258 milliseconds)
   [info] - nested data - struct with array field (406 milliseconds)
   [info] - nested data - array of struct (390 milliseconds)
   [info] - SPARK-1913 regression: columns only referenced by pushed down filters should remain (363 milliseconds)
   [info] - SPARK-5309 strings stored using dictionary compression in parquet (821 milliseconds)
   [info] - SPARK-6917 DecimalType should work with non-native types (201 milliseconds)
   [info] - SPARK-10634 timestamp written and read as INT64 - truncation (331 milliseconds)
   [info] - SPARK-36182, SPARK-47368: writing and reading TimestampNTZType column (750 milliseconds)
   17:56:39.487 ERROR org.apache.spark.executor.Executor: Exception in task 1.0 in stage 140.0 (TID 215)
   org.apache.spark.SparkException: Encountered error while reading file file:///__w/datafusion-comet/datafusion-comet/apache-spark/target/tmp/spark-8a25ab9e-6dec-41e3-8354-ceb405cbd9d5/part-00001-22a1399a-9d0c-4972-bf97-e6bdffa1c9cd-c000.snappy.parquet. Details:
        at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotReadFilesError(QueryExecutionErrors.scala:864)
        at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.nextIterator(FileScanRDD.scala:296)
        at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.hasNext(FileScanRDD.scala:131)
        at org.apache.spark.sql.comet.CometScanExec$$anon$1.hasNext(CometScanExec.scala:266)
   ```
   
   ```
   17:57:05.557 WARN org.apache.spark.sql.execution.datasources.parquet.ParquetV1QuerySuite:
   
   ===== POSSIBLE THREAD LEAK IN SUITE o.a.s.sql.execution.datasources.parquet.ParquetV1QuerySuite, threads: QueryStageCreator-47 (daemon=true), QueryStageCreator-52 (daemon=true), QueryStageCreator-58 (daemon=true), QueryStageCreator-54 (daemon=true), comet-broadcast-exchange-130 (daemon=true), shuffle-boss-714-1 (daemon=true), QueryStageCreator-57 (daemon=true), QueryStageCreator-46 (daemon=true), QueryStageCreator-59 (daemon=true), rpc-boss-711-1 (daemon=true), QueryStageCreator-51 (daemon=true), QuerySta...
   
   [info] org.apache.spark.sql.execution.datasources.parquet.ParquetV1QuerySuite *** ABORTED *** (31 seconds, 172 milliseconds)
   [info]   The code passed to eventually never returned normally. Attempted 15 times over 10.065526739000001 seconds. Last failure message: There are 12 possibly leaked file streams.. (SharedSparkSession.scala:189)
   [info]   org.scalatest.exceptions.TestFailedDueToTimeoutException:
   [info]   at org.scalatest.enablers.Retrying$$anon$4.tryTryAgain$2(Retrying.scala:219)
   [info]   at org.scalatest.enablers.Retrying$$anon$4.retry(Retrying.scala:226)
   [info]   at org.scalatest.concurrent.Eventually.eventually(Eventually.scala:313)
   [info]   at org.scalatest.concurrent.Eventually.eventually$(Eventually.scala:312)
   [info]   at org.apache.spark.sql.execution.datasources.parquet.ParquetQuerySuite.eventually(ParquetQuerySuite.scala:47)
   [info]   at org.apache.spark.sql.test.SharedSparkSessionBase.afterEach(SharedSparkSession.scala:189)
   [info]   at org.apache.spark.sql.test.SharedSparkSessionBase.afterEach$(SharedSparkSession.scala:183)
   [info]   at org.apache.spark.sql.execution.datasources.parquet.ParquetQuerySuite.afterEach(ParquetQuerySuite.scala:47)
   [info]   at org.scalatest.BeforeAndAfterEach.$anonfun$runTest$1(BeforeAndAfterEach.scala:247)
   [info]   at org.scalatest.Status.$anonfun$withAfterEffect$1(Status.scala:377)
   [info]   at org.scalatest.Status.$anonfun$withAfterEffect$1$adapted(Status.scala:373)
   [info]   at org.scalatest.SucceededStatus$.whenCompleted(Status.scala:462)
   [info]   at org.scalatest.Status.withAfterEffect(Status.scala:373)
   [info]   at org.scalatest.Status.withAfterEffect$(Status.scala:371)
   [info]   at org.scalatest.SucceededStatus$.withAfterEffect(Status.scala:434)
   [info]   at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:246)
   [info]   at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
   ```
   I am still trying to figure out whether these errors are caused by my changes.
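   
   For context on the second failure: the "There are 12 possibly leaked file streams" message comes from Spark's shared-session test harness, where `SharedSparkSessionBase.afterEach` retries an open-stream assertion with ScalaTest's `eventually` (that is the `SharedSparkSession.scala:189` frame in the trace). A rough sketch of that check, assuming the Spark core test jar's `DebugFilesystem` is on the classpath; the timeout/interval values here are illustrative, not the exact ones Spark uses:
   ```scala
   import org.apache.spark.DebugFilesystem
   import org.scalatest.concurrent.Eventually._
   import org.scalatest.time.SpanSugar._
   
   // Sketch of the check behind the "possibly leaked file streams" abort:
   // DebugFilesystem records every input stream it opens during a test, and
   // assertNoOpenStreams() throws if any of them are still open. The eventually
   // wrapper retries because streams may be closed from other threads.
   object LeakCheckSketch {
     def assertNoLeakedStreams(): Unit = {
       eventually(timeout(10.seconds), interval(500.milliseconds)) {
         DebugFilesystem.assertNoOpenStreams()
       }
     }
   }
   ```
   So if the earlier read error in `CometScanExec` leaves the Parquet input streams open, that alone could explain the suite abort rather than it being a separate problem.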

