Dear Spark developers,

I could not find a rolling policy for Spark event logs; for long-running
streaming applications the event log grows huge within a few hours.
The only way I found is to patch the existing event logging Spark
listener (EventLoggingListener),
but I believe there should already be a solution to this problem,
because it comes up whenever somebody runs streaming applications for a
long time.

What would be the right way to fix this? Are there any config settings
available for it, should we write our own scripts to clean up the log directory
regularly based on time/size (a rough sketch of what I mean follows), or do we need to implement a rolling-policy feature?
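
To illustrate the cleanup-script option, here is a rough, untested sketch of what I have in mind (the directory path is just a placeholder for wherever spark.eventLog.dir points, and the retention window is arbitrary); it simply deletes event log files older than a fixed number of days:

#!/usr/bin/env python3
# Hypothetical sketch: delete event log files older than MAX_AGE_DAYS
# from the event log directory (placeholder for spark.eventLog.dir).
import os
import time

EVENT_LOG_DIR = "/tmp/spark-events"   # placeholder path
MAX_AGE_DAYS = 7                      # arbitrary retention window

cutoff = time.time() - MAX_AGE_DAYS * 24 * 60 * 60

for name in os.listdir(EVENT_LOG_DIR):
    path = os.path.join(EVENT_LOG_DIR, name)
    # remove only regular files whose last modification predates the cutoff
    if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
        os.remove(path)
        print(f"removed old event log: {path}")

A cron job could run something like this periodically, but it only removes finished logs and does not shrink the live, still-growing log of a running streaming job, which is why a proper rolling policy seems preferable.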

Regards
Omkar
