Ranga Reddy created HUDI-9351:
---------------------------------

             Summary: java.io.InvalidClassException: org.apache.spark.sql.execution.datasources.parquet.HoodieFileGroupReaderBasedParquetFileFormat
                 Key: HUDI-9351
                 URL: https://issues.apache.org/jira/browse/HUDI-9351
             Project: Apache Hudi
          Issue Type: Bug
          Components: spark-sql
    Affects Versions: 1.0.2
            Reporter: Ranga Reddy
             Fix For: 1.1.0


Using the latest Hudi master (1.0.2), I am unable to query table data that was created with Hudi 1.0.0.
{code:java}
Driver stacktrace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 6) (172.19.0.4 executor 0): java.io.InvalidClassException: org.apache.spark.sql.execution.datasources.parquet.HoodieFileGroupReaderBasedParquetFileFormat; local class incompatible: stream classdesc serialVersionUID = 2474481021876753867, local class serialVersionUID = 4655759304258465903
    at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:560)
    at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2020)
    at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1870)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2201)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1687)
    at java.base/java.io.ObjectInputStream.readArray(ObjectInputStream.java:2134)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1675)
    at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2496)
    at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2390)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2228)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1687)
    at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2496)
    at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2390)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2228)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1687)
    at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2496)
    at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2390)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2228)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1687)
    at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:489)
    at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:447)
......
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:129)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:86)
    at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
    at org.apache.spark.scheduler.Task.run(Task.scala:141)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
    at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
    at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
{code}
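The mismatched serialVersionUID values in the trace suggest that HoodieFileGroupReaderBasedParquetFileFormat does not pin an explicit serialVersionUID, so the JVM-computed default changed between the 1.0.0 release and the current master. A minimal sketch of the usual fix, using a hypothetical FileFormatStub class (not the actual Hudi class), is to declare the field explicitly so that recompiled versions of the class stay stream-compatible:

{code:java}
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-in for a Serializable file-format class.
class FileFormatStub implements Serializable {
    // Pinning the UID keeps old serialized streams readable even after the
    // class gains or loses fields; without it, the JVM derives the UID from
    // the class shape, and any structural change triggers
    // java.io.InvalidClassException on deserialization, as seen above.
    private static final long serialVersionUID = 1L;

    String name = "parquet";
}

public class SerialVersionDemo {

    // Round-trip an object through Java serialization, the same mechanism
    // Spark's JavaSerializer uses when shipping tasks to executors.
    static FileFormatStub roundTrip(FileFormatStub obj) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            return (FileFormatStub) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip(new FileFormatStub()).name);
    }
}
{code}

With the UID pinned, a table written by one Hudi release can still be queried after upgrading, provided the class changes are otherwise serialization-compatible.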



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
