Hi,

When using org.apache.log4j.rolling.RollingFileAppender, it is apparently not
allowed to set:

    log4j.appender.file.file=${log.file}

It works for me if I remove this property from the log4j.properties file.

Moreover, you have configured:

    log4j.appender.file.RollingPolicy.FileNamePattern = logs/log.%d{yyyyMMdd-HHmm}.log

This will create the log files in the "logs" directory relative to where you
start the Flink cluster. You may want to change FileNamePattern to an
absolute path. Also note that the Flink default logging directory is "log"
and not "logs".

Best,
Gary


On Fri, Aug 17, 2018 at 8:28 PM, Navneet Kumar Pandey <navn...@essens.no>
wrote:

> Hi Gary,
>
> Thanks for quick reply.
>
> Following is output of  "cat /usr/lib/flink/conf/log4j.properties"
>
> log4j.rootLogger=INFO,file
>
> # Log all infos in the given file
> log4j.appender.file=org.apache.log4j.rolling.RollingFileAppender
> log4j.appender.file.file=${log.file}
> log4j.appender.file.append=false
> log4j.appender.file.layout=org.apache.log4j.PatternLayout
> log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
>
> # suppress the irrelevant (wrong) warnings from the netty channel handler
> log4j.logger.org.jboss.netty.channel.DefaultChannelPipeline=ERROR,file
> log4j.appender.file.RollingPolicy = org.apache.log4j.rolling.TimeBasedRollingPolicy
> log4j.appender.file.RollingPolicy.FileNamePattern = logs/log.%d{yyyyMMdd-HHmm}.log
> log4j.logger.no = DEBUG
>
>
> and I double checked the log4j library is in the lib
>
> [hadoop@ip-XXXXXX lib]$ ls /usr/lib/flink/lib/
> apache-log4j-extras-1.2.17.jar  flink-metrics-datadog-1.4.2.jar
> flink-queryable-state-runtime_2.11-1.4.2.jar  log4j-1.2.17.jar
> flink-dist_2.11-1.4.2.jar       flink-python_2.11-1.4.2.jar
> flink-shaded-hadoop2-uber-1.4.2.jar           slf4j-log4j12-1.7.7.jar
>
> On Fri, Aug 17, 2018 at 5:15 PM, Gary Yao <g...@data-artisans.com> wrote:
>
>> Hello Navneet Kumar Pandey,
>>
>> org.apache.log4j.rolling.RollingFileAppender is part of Apache Extras
>> Companion for Apache log4j [1]. Is that library in your classpath?
>>
>> Are there hints in taskmanager.err?
>>
>> Can you run:
>>
>>     cat /usr/lib/flink/conf/log4j.properties
>>
>> on the EMR master node and show the output?
>>
>> For troubleshooting, you can also try org.apache.log4j.RollingFileAppender,
>> which can roll the file if a certain size is exceeded. An example
>> configuration can be found here (I have not tested it):
>>
>>     https://github.com/apache/flink/pull/5371/files
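>>
>> For reference, a size-based setup could look roughly like this (again
>> untested; the MaxFileSize and MaxBackupIndex values are only placeholders):
>>
>>     log4j.appender.file=org.apache.log4j.RollingFileAppender
>>     log4j.appender.file.file=${log.file}
>>     log4j.appender.file.MaxFileSize=100MB
>>     log4j.appender.file.MaxBackupIndex=10
>>     log4j.appender.file.layout=org.apache.log4j.PatternLayout
>>     log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n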
>>
>> Best,
>> Gary
>>
>> [1] https://logging.apache.org/log4j/extras/
>>
>>
>> On Fri, Aug 17, 2018 at 4:09 PM, Navneet Kumar Pandey <navn...@essens.no>
>> wrote:
>>
>>> I am using Flink in EMR with following configuration.
>>>
>>>  {
>>>       "Classification": "flink-log4j",
>>>       "Properties": {
>>>             "log4j.logger.no":"DEBUG",
>>>             "log4j.appender.file":"org.apache.log4j.rolling.RollingFileAppender",
>>>             "log4j.appender.file.RollingPolicy.FileNamePattern":"logs/log.%d{yyyyMMdd-HHmm}.log",
>>>             "log4j.appender.file.RollingPolicy":"org.apache.log4j.rolling.TimeBasedRollingPolicy",
>>>             "log4j.appender.file.append":"false",
>>>             "log4j.appender.file.layout":"org.apache.log4j.PatternLayout",
>>>             "log4j.appender.file.layout.ConversionPattern":"%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n"
>>>
>>>       }
>>>     }
>>>
>>> FYI, this configuration gets written into Flink's log4j.properties. As you
>>> can see, even with this setting the taskmanager and jobmanager log files
>>> are not getting rolled.
>>>
>>> [hadoop@ip-XXXXXX ~]$ sudo ls -lh /mnt/var/log/hadoop-yarn/containers/application_DDDDDDDDDDD_0002/container_DDDDDDDDDDDD_0002_01_000002
>>> total 7.0G
>>> -rw-r--r-- 1 yarn yarn 770K Aug 17 14:02 taskmanager.err
>>> -rw-r--r-- 1 yarn yarn 6.0G Aug 17 14:02 taskmanager.log
>>> -rw-r--r-- 1 yarn yarn 526K Aug 17 13:54 taskmanager.out
>>>
>>> Can somebody give me a pointer on how to roll these log files?
>>> Note that these files are also being copied into s3.
>>>
>>>
>>
>>
>
