robbik commented on issue #12931:
URL: https://github.com/apache/hudi/issues/12931#issuecomment-2707971955

   Here is the full stack trace of the error encountered during the upsert:
   
   ```
   10:07:54,937 ERROR 
[org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner] Got exception 
when reading log file
   java.lang.IllegalStateException: The current lookup key is less than the 
current position of the cursor, i.e., backward seekTo, which is not supported 
and should be avoided. 
key=UTF8StringKey{OurBranchID:0324,AccountID:0324NT700001,LoanSeries:1,InstallmentNo:30}
 cursor=HFilePosition{offset=3485859, 
keyValue=Option{val=KeyValue{key=Key{OurBranchID:0324,AccountID:0324REG00002,LoanSeries:1,InstallmentNo:5}}}}
        at 
org.apache.hudi.io.hfile.HFileReaderImpl.seekTo(HFileReaderImpl.java:173) 
~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.io.storage.HoodieNativeAvroHFileReader$RecordByKeyIterator.hasNext(HoodieNativeAvroHFileReader.java:404)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processDataBlock(AbstractHoodieLogRecordScanner.java:633)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:675)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternalV2(AbstractHoodieLogRecordScanner.java:595)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternal(AbstractHoodieLogRecordScanner.java:248)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.scanByFullKeys(HoodieMergedLogRecordScanner.java:163)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieMetadataLogRecordReader.getAllRecordsByKeys(HoodieMetadataLogRecordReader.java:127)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.readAllLogRecords(HoodieBackedTableMetadata.java:477)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.lookupAllKeysFromFileSlice(HoodieBackedTableMetadata.java:455)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getAllRecordsByKeys$61b9eeed$1(HoodieBackedTableMetadata.java:327)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) [?:?]
        at 
java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) 
[?:?]
        at 
java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) [?:?]
        at 
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) 
[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) 
[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) 
[?:?]
        at java.util.stream.AbstractTask.compute(AbstractTask.java:327) [?:?]
        at 
java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) [?:?]
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) [?:?]
        at 
java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
 [?:?]
        at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) [?:?]
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) 
[?:?]
        at 
java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) 
[?:?]
   10:07:55,000 ERROR [org.apache.spark.executor.Executor] Exception in task 
28.0 in stage 2.0 (TID 553)
   org.apache.hudi.exception.HoodieException: 
org.apache.hudi.exception.HoodieException: Error occurs when executing map
        at 
jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
~[?:?]
        at 
jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
 ~[?:?]
        at 
jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 ~[?:?]
        at 
java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) ~[?:?]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:480) 
~[?:?]
        at 
java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:562) 
~[?:?]
        at 
java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:591) ~[?:?]
        at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:689) 
~[?:?]
        at 
java.util.stream.ReduceOps$ReduceOp.evaluateParallel(ReduceOps.java:927) ~[?:?]
        at 
java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233) ~[?:?]
        at 
java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?]
        at 
org.apache.hudi.common.engine.HoodieLocalEngineContext.map(HoodieLocalEngineContext.java:90)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.getAllRecordsByKeys(HoodieBackedTableMetadata.java:322)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.BaseTableMetadata.readRecordIndex(BaseTableMetadata.java:299)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.index.SparkMetadataTableRecordIndex$RecordIndexFileGroupLookupFunction.call(SparkMetadataTableRecordIndex.java:166)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.index.SparkMetadataTableRecordIndex$RecordIndexFileGroupLookupFunction.call(SparkMetadataTableRecordIndex.java:153)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.spark.api.java.JavaRDDLike.$anonfun$mapPartitionsToPair$1(JavaRDDLike.scala:186)
 ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2(RDD.scala:858) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2$adapted(RDD.scala:858) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
 ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.Task.run(Task.scala:141) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
 ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
 ~[spark-common-utils_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
 ~[spark-common-utils_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) 
~[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) 
[spark-core_2.12-3.5.2.jar:3.5.2]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) 
[?:?]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) 
[?:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
   Caused by: org.apache.hudi.exception.HoodieException: Error occurs when 
executing map
        at 
org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:40)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
        at 
java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) 
~[?:?]
        at 
java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
        at 
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) 
~[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) 
~[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) 
~[?:?]
        at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
        at 
java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) 
~[?:?]
        at 
java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
 ~[?:?]
        at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) 
~[?:?]
        at 
java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) 
~[?:?]
   Caused by: org.apache.hudi.exception.HoodieException: Exception when reading 
log file 
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternalV2(AbstractHoodieLogRecordScanner.java:604)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternal(AbstractHoodieLogRecordScanner.java:248)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.scanByFullKeys(HoodieMergedLogRecordScanner.java:163)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieMetadataLogRecordReader.getAllRecordsByKeys(HoodieMetadataLogRecordReader.java:127)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.readAllLogRecords(HoodieBackedTableMetadata.java:477)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.lookupAllKeysFromFileSlice(HoodieBackedTableMetadata.java:455)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getAllRecordsByKeys$61b9eeed$1(HoodieBackedTableMetadata.java:327)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
        at 
java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) 
~[?:?]
        at 
java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
        at 
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) 
~[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) 
~[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) 
~[?:?]
        at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
        at 
java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) 
~[?:?]
        at 
java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
 ~[?:?]
        at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) 
~[?:?]
        at 
java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) 
~[?:?]
   Caused by: java.lang.IllegalStateException: The current lookup key is less 
than the current position of the cursor, i.e., backward seekTo, which is not 
supported and should be avoided. 
key=UTF8StringKey{OurBranchID:0324,AccountID:0324NT700001,LoanSeries:1,InstallmentNo:30}
 cursor=HFilePosition{offset=3485859, 
keyValue=Option{val=KeyValue{key=Key{OurBranchID:0324,AccountID:0324REG00002,LoanSeries:1,InstallmentNo:5}}}}
        at 
org.apache.hudi.io.hfile.HFileReaderImpl.seekTo(HFileReaderImpl.java:173) 
~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.io.storage.HoodieNativeAvroHFileReader$RecordByKeyIterator.hasNext(HoodieNativeAvroHFileReader.java:404)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processDataBlock(AbstractHoodieLogRecordScanner.java:633)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:675)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternalV2(AbstractHoodieLogRecordScanner.java:595)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternal(AbstractHoodieLogRecordScanner.java:248)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.scanByFullKeys(HoodieMergedLogRecordScanner.java:163)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieMetadataLogRecordReader.getAllRecordsByKeys(HoodieMetadataLogRecordReader.java:127)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.readAllLogRecords(HoodieBackedTableMetadata.java:477)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.lookupAllKeysFromFileSlice(HoodieBackedTableMetadata.java:455)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getAllRecordsByKeys$61b9eeed$1(HoodieBackedTableMetadata.java:327)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
        at 
java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) 
~[?:?]
        at 
java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
        at 
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) 
~[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) 
~[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) 
~[?:?]
        at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
        at 
java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) 
~[?:?]
        at 
java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
 ~[?:?]
        at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) 
~[?:?]
        at 
java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) 
~[?:?]
   ```
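   For context, the `IllegalStateException` describes a forward-only cursor invariant: the HFile reader's `seekTo` may only advance, so lookup keys must be presented in sorted order, and in this trace the lookup key (`...NT700001...`) sorts before the key the cursor is already positioned at (`...REG00002...`). The toy sketch below (illustrative only, not Hudi's actual implementation; all names are made up) shows how a forward-only seek rejects an out-of-order lookup:

   ```python
   class ForwardOnlyCursor:
       """Toy forward-only reader: seek_to may never move backwards."""

       def __init__(self, sorted_keys):
           self.keys = sorted_keys
           self.last_key = None  # key of the current cursor position

       def seek_to(self, key):
           # Mirrors the check reported by HFileReaderImpl.seekTo:
           # a lookup key smaller than the cursor position is a backward seek.
           if self.last_key is not None and key < self.last_key:
               raise ValueError(
                   f"backward seekTo: key={key} < cursor={self.last_key}")
           self.last_key = key
           return key in self.keys

   cur = ForwardOnlyCursor(sorted(["0324NT700001", "0324REG00002"]))
   cur.seek_to("0324REG00002")    # cursor now sits past 0324NT700001
   # cur.seek_to("0324NT700001") would raise ValueError, matching the
   # "lookup key is less than the current position of the cursor" message.
   ```

   This suggests the metadata-table key lookup handed the reader an unsorted key batch, which is what the stack trace surfaces as the failure.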
   
   The upsert itself then fails with:
   ```
   org.apache.hudi.exception.HoodieUpsertException: Failed to upsert for commit 
time 20250308100148957
        at 
org.apache.hudi.table.action.commit.BaseWriteHelper.write(BaseWriteHelper.java:64)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.table.action.deltacommit.SparkUpsertDeltaCommitActionExecutor.execute(SparkUpsertDeltaCommitActionExecutor.java:45)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.table.HoodieSparkMergeOnReadTable.upsert(HoodieSparkMergeOnReadTable.java:98)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.table.HoodieSparkMergeOnReadTable.upsert(HoodieSparkMergeOnReadTable.java:88)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.client.SparkRDDWriteClient.upsert(SparkRDDWriteClient.java:132) 
~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.DataSourceUtils.doWriteOperation(DataSourceUtils.java:227) 
~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.HoodieSparkSqlWriterInternal.liftedTree1$1(HoodieSparkSqlWriter.scala:517)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.HoodieSparkSqlWriterInternal.writeInternal(HoodieSparkSqlWriter.scala:515)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.HoodieSparkSqlWriterInternal.$anonfun$write$1(HoodieSparkSqlWriter.scala:192)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:125)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:201)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:108)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900) 
~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.adapter.BaseSpark3Adapter.sqlExecutionWithNewExecutionId(BaseSpark3Adapter.scala:105)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.HoodieSparkSqlWriterInternal.write(HoodieSparkSqlWriter.scala:214)
 ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:129) 
~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:170) 
~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at 
org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:48)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:107)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:125)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:201)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:108)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900) 
~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:107)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:461)
 ~[spark-catalyst_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:76) 
~[spark-sql-api_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:461)
 ~[spark-catalyst_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32)
 ~[spark-catalyst_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
 ~[spark-catalyst_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
 ~[spark-catalyst_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
 ~[spark-catalyst_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
 ~[spark-catalyst_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:437) 
~[spark-catalyst_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:98)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:85)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:83)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:142)
 ~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:859) 
~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:388) 
~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at 
org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:361) 
~[spark-sql_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:248) 
~[spark-sql_2.12-3.5.2.jar:3.5.2]
           ...
   Caused by: org.apache.spark.SparkException: Job aborted due to stage 
failure: Task 28 in stage 2.0 failed 1 times, most recent failure: Lost task 
28.0 in stage 2.0 (TID 553) (10.255.255.254 executor driver): 
org.apache.hudi.exception.HoodieException: 
org.apache.hudi.exception.HoodieException: Error occurs when executing map
        at 
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native
 Method)
        at 
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
        at 
java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at 
java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
        at 
java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
        at 
java.base/java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:562)
        at 
java.base/java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:591)
        at 
java.base/java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:689)
        at 
java.base/java.util.stream.ReduceOps$ReduceOp.evaluateParallel(ReduceOps.java:927)
        at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233)
        at 
java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
        at 
org.apache.hudi.common.engine.HoodieLocalEngineContext.map(HoodieLocalEngineContext.java:90)
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.getAllRecordsByKeys(HoodieBackedTableMetadata.java:322)
        at 
org.apache.hudi.metadata.BaseTableMetadata.readRecordIndex(BaseTableMetadata.java:299)
        at 
org.apache.hudi.index.SparkMetadataTableRecordIndex$RecordIndexFileGroupLookupFunction.call(SparkMetadataTableRecordIndex.java:166)
        at 
org.apache.hudi.index.SparkMetadataTableRecordIndex$RecordIndexFileGroupLookupFunction.call(SparkMetadataTableRecordIndex.java:153)
        at 
org.apache.spark.api.java.JavaRDDLike.$anonfun$mapPartitionsToPair$1(JavaRDDLike.scala:186)
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2(RDD.scala:858)
        at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2$adapted(RDD.scala:858)
        at 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
        at 
org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
        at 
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
        at 
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
        at 
org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
        at org.apache.spark.scheduler.Task.run(Task.scala:141)
        at 
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
        at 
org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
        at 
org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
        at 
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at 
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:833)
   Caused by: org.apache.hudi.exception.HoodieException: Error occurs when 
executing map
        at 
org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:40)
        at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
        at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
        at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
        at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
        at 
java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960)
        at 
java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934)
        at 
java.base/java.util.stream.AbstractTask.compute(AbstractTask.java:327)
        at 
java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754)
        at 
java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373)
        at 
java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
        at 
java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655)
        at 
java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622)
        at 
java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165)
   Caused by: org.apache.hudi.exception.HoodieException: Exception when reading 
log file 
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternalV2(AbstractHoodieLogRecordScanner.java:604)
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternal(AbstractHoodieLogRecordScanner.java:248)
        at 
org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.scanByFullKeys(HoodieMergedLogRecordScanner.java:163)
        at 
org.apache.hudi.metadata.HoodieMetadataLogRecordReader.getAllRecordsByKeys(HoodieMetadataLogRecordReader.java:127)
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.readAllLogRecords(HoodieBackedTableMetadata.java:477)
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.lookupAllKeysFromFileSlice(HoodieBackedTableMetadata.java:455)
        at 
org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getAllRecordsByKeys$61b9eeed$1(HoodieBackedTableMetadata.java:327)
        at 
org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38)
        ... 13 more
   Caused by: java.lang.IllegalStateException: The current lookup key is less 
than the current position of the cursor, i.e., backward seekTo, which is not 
supported and should be avoided. 
key=UTF8StringKey{OurBranchID:0324,AccountID:0324NT700001,LoanSeries:1,InstallmentNo:30}
 cursor=HFilePosition{offset=3485859, 
keyValue=Option{val=KeyValue{key=Key{OurBranchID:0324,AccountID:0324REG00002,LoanSeries:1,InstallmentNo:5}}}}
        at 
org.apache.hudi.io.hfile.HFileReaderImpl.seekTo(HFileReaderImpl.java:173)
        at 
org.apache.hudi.io.storage.HoodieNativeAvroHFileReader$RecordByKeyIterator.hasNext(HoodieNativeAvroHFileReader.java:404)
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
        at 
org.apache.hudi.common.util.collection.MappingIterator.hasNext(MappingIterator.java:39)
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processDataBlock(AbstractHoodieLogRecordScanner.java:633)
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:675)
        at 
org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternalV2(AbstractHoodieLogRecordScanner.java:595)
        ... 20 more
   ```
   
   And the driver stack trace:
   
   ```
   Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2856) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2792) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2791) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) ~[scala-library-2.12.18.jar:?]
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) ~[scala-library-2.12.18.jar:?]
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) ~[scala-library-2.12.18.jar:?]
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2791) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1247) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1247) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at scala.Option.foreach(Option.scala:407) ~[scala-library-2.12.18.jar:?]
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1247) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3060) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2994) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2983) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:989) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2393) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2414) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2433) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2458) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1049) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:410) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.collect(RDD.scala:1048) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.PairRDDFunctions.$anonfun$countByKey$1(PairRDDFunctions.scala:367) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:410) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.PairRDDFunctions.countByKey(PairRDDFunctions.scala:367) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.api.java.JavaPairRDD.countByKey(JavaPairRDD.scala:314) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.hudi.data.HoodieJavaPairRDD.countByKey(HoodieJavaPairRDD.java:109) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.buildProfile(BaseSparkCommitActionExecutor.java:195) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.execute(BaseSparkCommitActionExecutor.java:166) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.execute(BaseSparkCommitActionExecutor.java:83) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.table.action.commit.BaseWriteHelper.write(BaseWriteHelper.java:58) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        ... 53 more
   Caused by: org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: Error occurs when executing map
        at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
        at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) ~[?:?]
        at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
        at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) ~[?:?]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:480) ~[?:?]
        at java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:562) ~[?:?]
        at java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:591) ~[?:?]
        at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:689) ~[?:?]
        at java.util.stream.ReduceOps$ReduceOp.evaluateParallel(ReduceOps.java:927) ~[?:?]
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233) ~[?:?]
        at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?]
        at org.apache.hudi.common.engine.HoodieLocalEngineContext.map(HoodieLocalEngineContext.java:90) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.metadata.HoodieBackedTableMetadata.getAllRecordsByKeys(HoodieBackedTableMetadata.java:322) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.metadata.BaseTableMetadata.readRecordIndex(BaseTableMetadata.java:299) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.index.SparkMetadataTableRecordIndex$RecordIndexFileGroupLookupFunction.call(SparkMetadataTableRecordIndex.java:166) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.index.SparkMetadataTableRecordIndex$RecordIndexFileGroupLookupFunction.call(SparkMetadataTableRecordIndex.java:153) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.spark.api.java.JavaRDDLike.$anonfun$mapPartitionsToPair$1(JavaRDDLike.scala:186) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2(RDD.scala:858) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.$anonfun$mapPartitions$2$adapted(RDD.scala:858) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.scheduler.Task.run(Task.scala:141) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) ~[spark-core_2.12-3.5.2.jar:3.5.2]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
        at java.lang.Thread.run(Thread.java:833) ~[?:?]
   Caused by: org.apache.hudi.exception.HoodieException: Error occurs when executing map
        at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:40) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:960) ~[?:?]
        at java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:934) ~[?:?]
        at java.util.stream.AbstractTask.compute(AbstractTask.java:327) ~[?:?]
        at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:754) ~[?:?]
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373) ~[?:?]
        at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182) ~[?:?]
        at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655) ~[?:?]
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622) ~[?:?]
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165) ~[?:?]
   Caused by: org.apache.hudi.exception.HoodieException: Exception when reading log file
        at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternalV2(AbstractHoodieLogRecordScanner.java:604) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternal(AbstractHoodieLogRecordScanner.java:248) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.scanByFullKeys(HoodieMergedLogRecordScanner.java:163) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.metadata.HoodieMetadataLogRecordReader.getAllRecordsByKeys(HoodieMetadataLogRecordReader.java:127) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
        at org.apache.hudi.metadata.HoodieBackedTableMetadata.readAllLogRecords(HoodieBackedTableMetadata.java:477) ~[hudi-spark3.5-bundle_2.12-1.0.1.jar:1.0.1]
   ```

