JoshuaZhuCN commented on issue #3981: URL: https://github.com/apache/hudi/issues/3981#issuecomment-979846358
> @JoshuaZhuCN spark should support read pure log table. When you specify the load path, do not use wildcards, just specify the path to the table level. load("hdfs://localhost:9000/hoodie/tb_hbase_test/default/*") -> load("hdfs://localhost:9000/hoodie/tb_hbase_test")

@xiarixiaoyao Thanks for the idea. I reproduced this scenario:

1. With the old read logic, loading the data with wildcards, the pure log files cannot be read.
2. With the new read logic, loading the table-level path without wildcards, the pure log files can be read, but only if the `DataSourceWriteOptions` `PARTITIONPATH_FIELD` option is set when writing; otherwise an error is raised (see the sketch below).
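For reference, a minimal spark-shell sketch of the two read styles and the write option mentioned above. The column names, the MOR table type, and the sample row are illustrative assumptions; the option keys are the string forms of the `DataSourceWriteOptions` fields, and the paths come from the example in this thread.

```scala
import org.apache.spark.sql.SaveMode
import spark.implicits._

val basePath = "hdfs://localhost:9000/hoodie/tb_hbase_test"

// Illustrative data; column names are assumptions for this sketch
val df = Seq((1, "a", "default", 1000L)).toDF("rowkey", "value", "partition", "ts")

df.write.format("hudi").
  option("hoodie.datasource.write.recordkey.field", "rowkey").
  // PARTITIONPATH_FIELD must be set here, otherwise the non-wildcard read errors out
  option("hoodie.datasource.write.partitionpath.field", "partition").
  option("hoodie.datasource.write.precombine.field", "ts").
  option("hoodie.datasource.write.table.type", "MERGE_ON_READ").
  option("hoodie.table.name", "tb_hbase_test").
  mode(SaveMode.Append).
  save(basePath)

// Old read logic: wildcard partition path, cannot see a pure-log table
// spark.read.format("hudi").load(s"$basePath/default/*")

// New read logic: table-level path without wildcards, reads the pure log files
val readDf = spark.read.format("hudi").load(basePath)
readDf.show()
```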