YannByron commented on issue #5932:
URL: https://github.com/apache/hudi/issues/5932#issuecomment-1166798089

   @leoyy0316 Spark DataFrame and SQL do not share the same cache. You deleted the 
data via spark-sql, but when you read it back through the DataFrame API, Spark 
answers from the cached `LogicalPlan`, so you still see the old rows. There is 
an improvement to make in Hudi here: refresh/invalidate the cache for the same 
source automatically.
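
   As a user-side workaround, the cache can be invalidated manually before re-reading. A minimal sketch, assuming the table is registered under the hypothetical name `hudi_tbl`:

   ```sql
   -- Invalidate Spark's cached entries (data and metadata) for the table,
   -- so the next DataFrame or SQL read goes back to the source files.
   -- The cache is repopulated lazily on the next access.
   REFRESH TABLE hudi_tbl;
   ```

   The equivalent from the DataFrame side is `spark.catalog.refreshTable("hudi_tbl")`.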


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
