Hey there. I have my Spark applications set up to write their event logs to S3 - this is super useful for ephemeral clusters, since I get a persistent history even after the hosts go away.
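For context, the app-side setup is basically just event logging pointed at a bucket - something roughly like this (the bucket name and path here are placeholders, not my real values):

    spark.eventLog.enabled   true
    spark.eventLog.dir       s3a://my-bucket/spark-events/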
A history server is set up to read this S3 location, and that works fine too - at least on startup. The problem is that the history server doesn't seem to notice new logs arriving in the S3 bucket after it has started. Any idea how I can get it to re-scan the folder for new files? Thanks, -miles
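P.S. In case it matters, the history server is just the standalone one pointed at the same path - started roughly like this (placeholder bucket again):

    export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=s3a://my-bucket/spark-events/"
    ./sbin/start-history-server.sh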