Zouxxyy opened a new pull request, #13041:
URL: https://github.com/apache/hudi/pull/13041

   ### Change Logs
   
   Fix `insert overwrite` with `bulk_insert` failing when rerun after a previously killed insert overwrite. The killed run leaves a pending replacecommit instant on the timeline whose requested file no longer exists, so the rerun fails with the exception below.
   
   ```sql
   set hoodie.spark.sql.insert.into.operation=bulk_insert;
   
   create table t(
     id int,
     name string,
     dt int
   ) using hudi
    tblproperties (
     type = 'cow',
     primaryKey = 'id'
    ) partitioned by (dt);
   
   -- kill the job while this statement is running
   insert overwrite table t partition (dt) values(1, 'a1', 10);
   
   -- rerunning raises the exception below
   insert overwrite table t partition (dt) values(1, 'a1', 10);
   ```
   
   
   ```
   Caused by: org.apache.hudi.exception.HoodieIOException: Failed to check emptiness of instant [==>20250327073905540__replacecommit__REQUESTED]
        at org.apache.hudi.common.table.timeline.TimelineUtils.isEmpty(TimelineUtils.java:616)
        at org.apache.hudi.common.table.timeline.versioning.v2.ActiveTimelineV2.isEmpty(ActiveTimelineV2.java:755)
        at org.apache.hudi.common.util.ClusteringUtils.isEmptyReplaceOrClusteringInstant(ClusteringUtils.java:226)
        at org.apache.hudi.common.util.ClusteringUtils.getRequestedReplaceMetadata(ClusteringUtils.java:198)
        at org.apache.hudi.common.util.ClusteringUtils.getClusteringPlan(ClusteringUtils.java:259)
        at org.apache.hudi.common.util.ClusteringUtils.getClusteringPlan(ClusteringUtils.java:248)
        at org.apache.hudi.common.util.ClusteringUtils.lambda$getAllPendingClusteringPlans$0(ClusteringUtils.java:87)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
        at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
        at org.apache.hudi.common.util.ClusteringUtils.getAllFileGroupsInPendingClusteringPlans(ClusteringUtils.java:282)
        ... 45 more
   Caused by: java.io.FileNotFoundException: File file:/private/var/folders/px/y3gybll50ggctcjp2t4r2b500000gp/T/spark-c259d9c8-c882-46c6-b52b-722c77922855/h1/.hoodie/timeline/20250327073905540.replacecommit.requested does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:779)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:1100)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:769)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:462)
        at org.apache.hudi.hadoop.fs.HoodieWrapperFileSystem.lambda$getFileStatus$17(HoodieWrapperFileSystem.java:415)
        at org.apache.hudi.hadoop.fs.HoodieWrapperFileSystem.executeFuncWithTimeMetrics(HoodieWrapperFileSystem.java:118)
        at org.apache.hudi.hadoop.fs.HoodieWrapperFileSystem.getFileStatus(HoodieWrapperFileSystem.java:409)
        at org.apache.hudi.storage.hadoop.HoodieHadoopStorage.getPathInfo(HoodieHadoopStorage.java:170)
        at org.apache.hudi.common.table.timeline.TimelineUtils.isEmpty(TimelineUtils.java:613)
        ... 59 more
   ```
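   As a rough illustration only (not the actual patch in this PR), the defensive behavior needed here can be sketched with a hypothetical helper that treats a requested-instant file that has vanished from the timeline directory as an empty instant, instead of letting the `FileNotFoundException` surface as a `HoodieIOException`:

   ```java
   import java.io.IOException;
   import java.io.UncheckedIOException;
   import java.nio.file.Files;
   import java.nio.file.Path;

   public class PendingInstantCheck {
       // Hypothetical helper mirroring the spirit of TimelineUtils.isEmpty:
       // a missing requested file (e.g. left behind by a killed insert
       // overwrite and since cleaned up) is reported as empty rather than
       // raising FileNotFoundException when the timeline is scanned.
       static boolean isEmptyOrMissing(Path instantFile) {
           try {
               return !Files.exists(instantFile) || Files.size(instantFile) == 0;
           } catch (IOException e) {
               throw new UncheckedIOException(e);
           }
       }

       public static void main(String[] args) throws IOException {
           // Requested file from a killed run that no longer exists on disk
           Path missing = Path.of("20250327073905540.replacecommit.requested");
           System.out.println(isEmptyOrMissing(missing)); // prints true

           // An existing but zero-length requested file is also empty
           Path empty = Files.createTempFile("instant", ".requested");
           System.out.println(isEmptyOrMissing(empty)); // prints true
           Files.delete(empty);
       }
   }
   ```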
   
   ### Impact
   
   Fixes the `HoodieIOException` above, so rerunning an `insert overwrite` after a previously killed bulk insert succeeds.
   
   ### Risk level (write none, low medium or high below)
   
   low
   
   ### Documentation Update
   
   none
   
   ### Contributor's checklist
   
   - [ ] Read through [contributor's 
guide](https://hudi.apache.org/contribute/how-to-contribute)
   - [ ] Change Logs and Impact were stated clearly
   - [ ] Adequate tests were added if applicable
   - [ ] CI passed
   

