punish-yh opened a new issue #4204:
URL: https://github.com/apache/hudi/issues/4204


   **_Tips before filing an issue_**
   
   - Have you gone through our 
[FAQs](https://cwiki.apache.org/confluence/display/HUDI/FAQ)?
   yes
   - Join the mailing list to engage in conversations and get faster support at 
dev-subscr...@hudi.apache.org.
   
   - If you have triaged this as a bug, then file an 
[issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.
   
   **Describe the problem you faced**
   
   When I use Flink SQL to insert data from a flink-cdc 2.1 source into Hudi, the job runs fine at first, but after a while it throws an InvalidAvroMagicException.
   
   
   
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Start the Flink standalone cluster and the SQL client
   ````
   ./bin/start-cluster.sh
   ./bin/sql-client.sh embedded -j ./hudi-flink-bundle_2.11-0.10.0-rc2.jar shell
   ````
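
   Not stated in the original report, but relevant to the repro: the Hudi Flink writer only commits instants when Flink checkpoints complete, so checkpointing is assumed to be enabled before the cluster is started. A minimal sketch (the interval and checkpoint directory below are assumptions, not taken from the report):
   ````
   # Sketch only: enable checkpointing before starting the cluster.
   # The interval and checkpoint directory are assumptions, not from the report.
   cat >> conf/flink-conf.yaml <<'EOF'
   execution.checkpointing.interval: 10s
   state.checkpoints.dir: hdfs://localhost:9000/flink-checkpoints
   EOF
   ````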
   2. Create the source (MySQL CDC) and sink (Hudi) tables
   ````
   CREATE TABLE t_abnormal_0 (
       id BIGINT,
       eid STRING,
       seq_no INT,
       name STRING,
       reg_no STRING,
       province STRING,
       in_reason STRING,
       out_reason STRING,
       out_date STRING,
       department STRING,
       last_update_time BIGINT,
       ops_flag INT,
       u_tags INT,
       obj_id STRING,
       row_update_time TIMESTAMP(0),
       out_department STRING,
       PRIMARY KEY(id) NOT ENFORCED
   ) WITH (
       'connector' = 'mysql-cdc',
       'hostname' = 'hostname',
       'port' = '3306',
       'username' = 'username',
       'password' = 'password',
       'database-name' = 'dbname',
       'table-name' = 'tablename',
       'scan.startup.mode' = 'initial'
   );
   
   CREATE TABLE hudi_abnormal_0(
       id BIGINT,
       eid STRING,
       seq_no INT,
       name STRING,
       reg_no STRING,
       province STRING,
       in_reason STRING,
       out_reason STRING,
       out_date STRING,
       department STRING,
       last_update_time BIGINT,
       ops_flag INT,
       u_tags INT,
       obj_id STRING,
       row_update_time TIMESTAMP(0),
       out_department STRING,
       PRIMARY KEY(id) NOT ENFORCED
   ) WITH (
       'connector' = 'hudi',
       'path' = 'hdfs://localhost:9000/flink-hudi/hudi_abnormal_0',
       'table.type' = 'MERGE_ON_READ',
       'read.streaming.enabled' = 'true'
   );
   ````
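
   For context only, a hedged sketch of a MERGE_ON_READ sink with two options that are often set alongside the ones above (a precombine/ordering field and async compaction); the table name, columns, and values here are illustrative assumptions, not part of the original report:
   ````
   -- Illustrative sketch only (not the reporter's DDL): a minimal MOR sink with
   -- a precombine field and async compaction enabled. Option values are assumptions.
   CREATE TABLE hudi_sink_sketch (
       id BIGINT,
       name STRING,
       row_update_time TIMESTAMP(0),
       PRIMARY KEY (id) NOT ENFORCED
   ) WITH (
       'connector' = 'hudi',
       'path' = 'hdfs://localhost:9000/flink-hudi/hudi_sink_sketch',
       'table.type' = 'MERGE_ON_READ',
       'read.streaming.enabled' = 'true',
       'write.precombine.field' = 'row_update_time',
       'compaction.async.enabled' = 'true'
   );
   ````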
   3. Execute the INSERT statement
   ````
   INSERT INTO hudi_abnormal_0 SELECT * FROM t_abnormal_0;
   ````
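
   Not part of the original steps: since the sink table sets 'read.streaming.enabled' = 'true', a second SQL client session can stream-read the table to confirm rows are landing before the failure appears (a verification sketch, assuming the tables above):
   ````
   -- Verification sketch only: streaming read from the Hudi table created above.
   SELECT id, name, row_update_time FROM hudi_abnormal_0;
   ````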
   
   **Expected behavior**
   
   The streaming job should keep inserting data from the CDC source into the Hudi table without failing with InvalidAvroMagicException.
   
   **Environment Description**
   
   * Hudi version : 0.10.0 (hudi-flink-bundle_2.11-0.10.0-rc2.jar)

   * Spark version : N/A

   * Hive version : N/A

   * Flink version : 1.13.3 (Scala 2.11)

   * Hadoop version :

   * Storage (HDFS/S3/GCS..) : HDFS

   * Running on Docker? (yes/no) : no
   
   
   **Additional context**
   
   
   **Stacktrace**
   
   ````
   org.apache.flink.util.FlinkException: Global failure triggered by OperatorCoordinator for 'hoodie_stream_write' (operator 5d50644a659cc89165fd5f9dbd7f2b80).
       at org.apache.flink.runtime.operators.coordination.OperatorCoordinatorHolder$LazyInitializedCoordinatorContext.failJob(OperatorCoordinatorHolder.java:553)
       at org.apache.hudi.sink.StreamWriteOperatorCoordinator.lambda$start$0(StreamWriteOperatorCoordinator.java:170)
       at org.apache.hudi.sink.utils.NonThrownExecutor.lambda$execute$0(NonThrownExecutor.java:103)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieException: Executor executes action [initialize instant 20211203151349684] error
       ... 5 more
   Caused by: org.apache.hudi.exception.HoodieIOException: Fetching rollback plan failed for [==>20211203151924723__rollback__REQUESTED]
       at org.apache.hudi.client.AbstractHoodieWriteClient.lambda$getPendingRollbackInfos$8(AbstractHoodieWriteClient.java:877)
       at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
       at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
       at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
       at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
       at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
       at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
       at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
       at org.apache.hudi.client.AbstractHoodieWriteClient.getPendingRollbackInfos(AbstractHoodieWriteClient.java:880)
       at org.apache.hudi.client.AbstractHoodieWriteClient.rollbackFailedWrites(AbstractHoodieWriteClient.java:897)
       at org.apache.hudi.client.AbstractHoodieWriteClient.rollbackFailedWrites(AbstractHoodieWriteClient.java:887)
       at org.apache.hudi.client.AbstractHoodieWriteClient.lambda$startCommitWithTime$97cdbdca$1(AbstractHoodieWriteClient.java:780)
       at org.apache.hudi.common.util.CleanerUtils.rollbackFailedWrites(CleanerUtils.java:143)
       at org.apache.hudi.client.AbstractHoodieWriteClient.startCommitWithTime(AbstractHoodieWriteClient.java:779)
       at org.apache.hudi.client.AbstractHoodieWriteClient.startCommitWithTime(AbstractHoodieWriteClient.java:772)
       at org.apache.hudi.sink.StreamWriteOperatorCoordinator.startInstant(StreamWriteOperatorCoordinator.java:334)
       at org.apache.hudi.sink.StreamWriteOperatorCoordinator.lambda$initInstant$5(StreamWriteOperatorCoordinator.java:361)
       at org.apache.hudi.sink.utils.NonThrownExecutor.lambda$execute$0(NonThrownExecutor.java:93)
       ... 3 more
   Caused by: org.apache.hudi.org.apache.avro.InvalidAvroMagicException: Not an Avro data file
       at org.apache.hudi.org.apache.avro.file.DataFileReader.openReader(DataFileReader.java:56)
       at org.apache.hudi.common.table.timeline.TimelineMetadataUtils.deserializeAvroMetadata(TimelineMetadataUtils.java:183)
       at org.apache.hudi.table.action.rollback.RollbackUtils.getRollbackPlan(RollbackUtils.java:68)
       at org.apache.hudi.client.AbstractHoodieWriteClient.lambda$getPendingRollbackInfos$8(AbstractHoodieWriteClient.java:874)
       ... 20 more
   ````
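
   The "Not an Avro data file" error is raised while deserializing the rollback plan for instant 20211203151924723 (the instant time comes from the stack trace above). A quick diagnostic, not from the original report, is to check whether the corresponding rollback file under the table's .hoodie directory is empty or truncated:
   ````
   # Diagnostic sketch (assumptions: table path from the DDL above, instant time
   # from the stack trace). A zero-byte .rollback.requested file would explain
   # the InvalidAvroMagicException thrown while reading the rollback plan.
   hdfs dfs -ls hdfs://localhost:9000/flink-hudi/hudi_abnormal_0/.hoodie | grep 20211203151924723
   ````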
   
   

