[ https://issues.apache.org/jira/browse/FLINK-6019?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16501456#comment-16501456 ]

Luke Hutchison commented on FLINK-6019:
---------------------------------------

Unfortunately I'm not knowledgeable enough about the complexities of Java logging 
to be the right person to write the doc changes; I'd probably get something wrong. 
Here's what I figured out, though:

* Yes, you can't have both a log4j-to-slf4j bridge and an slf4j-to-log4j bridge 
on the classpath at the same time without creating an infinite loop.
* As far as I can tell, though, no bridge is installed by default (at least not 
for the Flink libraries I was using).
* slf4j is the way forward, since it unifies all the logging frameworks -- it 
can act as middleware between any logging frontend and any logging backend. (I 
didn't know this before...)  It also has its own logging API, so it can serve 
as its own frontend; see the sketch right after this list.
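
To illustrate that last point, here's a minimal sketch (mine, not from the Flink 
docs; the class name is made up) of code that talks only to the slf4j API -- 
whichever binding or bridge happens to be on the classpath then decides where 
the output actually goes:

{code}
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyJob {
    // Application code only touches the slf4j API; with slf4j-log4j12 on the
    // classpath these calls end up in log4j, with logback-classic they would
    // end up in logback, and so on.
    private static final Logger LOG = LoggerFactory.getLogger(MyJob.class);

    public static void main(String[] args) {
        LOG.info("Starting job");
        LOG.debug("Parsed {} rows", 42);   // parameterized, no string concatenation
        LOG.warn("Falling back to default configuration");
    }
}
{code}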

Maybe it's enough to simply note in the logging pages that there can be multiple 
logging frameworks in the system, used by different libraries on the classpath 
or module path, and that if you want all log output to go to one place, with the 
same formatting, you will need to install a bridge that redirects everything to 
one of the loggers. Naming a few bridges might help point people in the right 
direction to start looking. (I had no idea what was wrong or how to start 
looking for info to fix this...)
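
For example (just an illustration, not an official or exhaustive list -- these 
are the common ones in the org.slf4j 1.7.x line):

{code}
# Bridges that route other frameworks' calls INTO slf4j:
#   org.slf4j:log4j-over-slf4j      log4j 1.2 API      -> slf4j
#   org.slf4j:jcl-over-slf4j        commons-logging    -> slf4j
#   org.slf4j:jul-to-slf4j          java.util.logging  -> slf4j
#
# Bindings that route slf4j OUT to a concrete backend (use exactly one):
#   org.slf4j:slf4j-log4j12         slf4j -> log4j 1.2
#   org.slf4j:slf4j-jdk14           slf4j -> java.util.logging
#   org.slf4j:slf4j-simple          slf4j -> console (System.err)
#   ch.qos.logback:logback-classic  slf4j -> logback (native)
{code}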

The Maven dependencies I ended up with were the slf4j logging API (for logging 
in my own code), and the slf4j-log4j12 bridge for sending slf4j logging to 
log4j:

{code}
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.25</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.25</version>
</dependency>
{code}

The log4j.properties file that I ended up with was:

{code}
# Root logger (default)
log4j.rootLogger=INFO, stdout

# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %C{1}:%L\t%m%n

# Change to INFO for verbose info about the Flink pipeline as it runs
log4j.logger.org.apache.flink=WARN

# Specifically limit other chatty loggers here (e.g. AWS logs every byte transferred in INFO mode)
log4j.logger.org.apache.http=WARN
log4j.logger.com.amazonaws=WARN
{code}
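
As a quick sanity check of the per-logger levels above, something like this 
(hypothetical class and logger names, purely for illustration) shows what gets 
through and what gets suppressed with that configuration:

{code}
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LogLevelCheck {
    public static void main(String[] args) {
        Logger own = LoggerFactory.getLogger(LogLevelCheck.class);
        Logger aws = LoggerFactory.getLogger("com.amazonaws.example.Client");

        own.info("printed: the root logger is at INFO and no narrower rule matches");
        aws.info("suppressed: log4j.logger.com.amazonaws=WARN filters out INFO");
        aws.warn("printed: WARN meets the com.amazonaws threshold");
    }
}
{code}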



> Some log4j messages do not have a loglevel field set, so they can't be 
> suppressed
> ---------------------------------------------------------------------------------
>
>                 Key: FLINK-6019
>                 URL: https://issues.apache.org/jira/browse/FLINK-6019
>             Project: Flink
>          Issue Type: Bug
>          Components: Core
>    Affects Versions: 1.2.0
>         Environment: Linux
>            Reporter: Luke Hutchison
>            Priority: Major
>
> Some of the log messages do not appear to have a loglevel value set, so they 
> can't be suppressed by setting the log4j level to WARN. There's this line at 
> the beginning which doesn't even have a timestamp:
> {noformat}
> Connected to JobManager at Actor[akka://flink/user/jobmanager_1#1844933939]
> {noformat}
> And then there are numerous lines like this, missing an "INFO" field:
> {noformat}
> 03/10/2017 00:01:14   Job execution switched to status RUNNING.
> 03/10/2017 00:01:14   DataSource (at readTable(DBTableReader.java:165) 
> (org.apache.flink.api.java.io.PojoCsvInputFormat))(1/8) switched to SCHEDULED 
> 03/10/2017 00:01:14   DataSink (count())(1/8) switched to SCHEDULED 
> 03/10/2017 00:01:14   DataSink (count())(3/8) switched to DEPLOYING 
> 03/10/2017 00:01:15   DataSink (count())(3/8) switched to RUNNING 
> 03/10/2017 00:01:17   DataSink (count())(6/8) switched to FINISHED 
> 03/10/2017 00:01:17   DataSource (at readTable(DBTableReader.java:165) 
> (org.apache.flink.api.java.io.PojoCsvInputFormat))(6/8) switched to FINISHED 
> 03/10/2017 00:01:17   Job execution switched to status FINISHED.
> {noformat}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
