The logs from the executor are redirected to stdout only because the
default log4j.properties is configured to do so. If you put your own
log4j.properties with a rolling file appender on the classpath (refer to
the Spark docs for how), all the logs will get redirected to separate
files that will roll over.
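As a rough sketch, such a log4j.properties could look like the following. This uses the standard size-based org.apache.log4j.RollingFileAppender; the file path, sizes, and pattern are placeholders you would adjust for your deployment:

```properties
# Sketch: route all logs to a size-based rolling file instead of stdout.
# File path, MaxFileSize, and MaxBackupIndex are placeholder values.
log4j.rootLogger=INFO, rolling
log4j.appender.rolling=org.apache.log4j.RollingFileAppender
log4j.appender.rolling.File=/var/log/spark/executor.log
log4j.appender.rolling.MaxFileSize=50MB
log4j.appender.rolling.MaxBackupIndex=5
log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
log4j.appender.rolling.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```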
Hi TD,
I thought about that, but I was not sure whether it would have any impact on
the Spark UI / executor runner, since they redirect the streams to
stderr/stdout. Ideally it should not, as the UI fetches log records from the
stderr file (which is the latest).
Is my understanding correct?
Thanks,
Sourav
You can use RollingFileAppenders in log4j.properties.
http://logging.apache.org/log4j/extras/apidocs/org/apache/log4j/rolling/RollingFileAppender.html
You can have other scripts delete old logs.
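For the cleanup part, a minimal sketch of such a script is below. The directory layout and file-name pattern are assumptions (they depend on where your appender writes); the retention period is a placeholder:

```shell
#!/usr/bin/env sh
# Sketch: delete rolled-over log files older than a given number of days.
# The "*.log.*" pattern matches backups produced by a rolling appender
# (e.g. executor.log.1, executor.log.2); adjust to your naming scheme.
clean_old_logs() {
  dir="$1"
  days="$2"
  # -mtime +N matches files last modified more than N days ago.
  find "$dir" -name "*.log.*" -type f -mtime +"$days" -delete
}
```

You could run this from cron, e.g. `clean_old_logs /var/log/spark 7` once a day.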
TD
On Mon, Mar 24, 2014 at 12:20 AM, Sourav Chandra <
sourav.chan...@livestream.com> wrote:
Hi,
I have a few questions regarding log file management in Spark:
1. Currently I did not find any way to modify the log file names for
executors/drivers. They are hardcoded as stdout and stderr, and there is no
log rotation.
In the case of a streaming application, these files will grow forever and
become unmanageable.