Yes, the dynamic log level modification worked great for me.
Thanks a lot,
Vadim
From: Biao Geng
Date: Tuesday, 14 May 2024 at 10:07
To: Vararu, Vadim
Cc: user@flink.apache.org
Subject: Re: Proper way to modify log4j config file for kubernetes-session
Hi Vararu,
Does this document meet your requirements?
https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/resource-providers/native_kubernetes/#logging
Best,
Biao Geng
Vararu, Vadim wrote on Tue, 14 May 2024 at 01:39:
Hi,
Trying to configure loggers in the log4j-console.properties file (that is
mounted from the host where the kubernetes-session.sh is invoked and referenced
by the TM processes via -Dlog4j.configurationFile).
Is there a proper (documented) way to do that, meaning to append/modify the
log4j
I assume you are using "*bin/flink run-application*" to submit a Flink
application to the K8s cluster. Then you can simply update your local
log4j-console.properties; it will be shipped and mounted to the
JobManager/TaskManager pods via a ConfigMap.
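For instance, assuming the log4j 2.x properties format that recent Flink versions ship, a level override appended to the local conf/log4j-console.properties before submission might look like this (the com.mycompany logger name is a placeholder):

```properties
# Hypothetical logger override appended to conf/log4j-console.properties
# before running kubernetes-session.sh / flink run-application.
logger.mycompany.name = com.mycompany
logger.mycompany.level = DEBUG
```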
Best,
Yang
Vladislav Keda wrote on 20 Jun 2023:
Hi all again!
Please tell me if you can answer my question, thanks.
---
Best Regards,
Vladislav Keda
Fri, 16 Jun 2023 at 16:12, Vladislav Keda <
vladislav.k...@glowbyteconsulting.com>:
Hi all!
Is it possible to change Flink's *log4j-console.properties* in Native
Kubernetes (for example in Kubernetes Application mode) without rebuilding
the application docker image?
I was trying to inject a .sh script call (in the attachment) before
/docker-entrypoint.sh, but this workaround did
>
> I checked the image prior to cluster creation; all log files are there.
> Once the cluster is deployed, they are missing. (bug?)
I do not think it is a bug, since we have already shipped all the config
files (log4j properties, flink-conf.yaml) via the ConfigMap.
Then it is direc
Best,
Tamir.
From: Tamir Sagi
Sent: Friday, January 21, 2022 7:19 PM
To: Yang Wang
Cc: user@flink.apache.org
Subject: Re: Flink 1.14.2 - Log4j2 -Dlog4j.configurationFile is ignored and
falls back to default /opt/flink/conf/log4j-console.properties
Changing the order of exec command makes sense to me. Would you please create a
ticket for this?
The /opt/flink/conf is cleaned up because we are mounting the conf files from
K8s ConfigMap.
Best,
Yang
Tamir Sagi <tamir.s...@niceactimize.com> wrote:
is better). Does it make
> sense to you?
>
> In addition, any idea why /opt/flink/conf gets cleaned (only
> flink-conf.xml is there)?
>
> Best,
> Tamir
>
> ------
> *From:* Yang Wang
> *Sent:* Tuesday, January 18, 2022 6:02 AM
From: Yang Wang
Sent: Tuesday, January 18, 2022 6:02 AM
To: Tamir Sagi
Cc: user@flink.apache.org
Subject: Re: Flink 1.14.2 - Log4j2 -Dlog4j.configurationFile is ignored and
falls back to default /opt/flink/conf/log4j-console.properties
M/TM start
command, but the jobmanager.sh/taskmanager.sh. We do not
have the same logic in the "flink-console.sh".
Maybe we could introduce an environment variable for the log configuration
file name in "flink-console.sh". The default value could be
"log4j-console.properties".
If org.apache.flink.kubernetes.kubeclient.parameters#hasLog4j returns
false, then the logging args are not added to the start command.
1. Why does the config dir get cleaned once the cluster starts? Even when I
pushed log4j-console.properties to the expected location (/opt/flink/conf),
the directory includes only flink-conf.yaml.
2. I think
I think the root cause is that we are using "flink-console.sh" to start the
JobManager/TaskManager process for native K8s integration after
FLINK-21128[1].
So it forces the log4j configuration name to be "log4j-console.properties".
[1] https://issues.apache.org/jira/browse/FLINK-21128
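To illustrate the root cause, here is a rough approximation of the logging-args logic in flink-console.sh (not the verbatim script; paths and exact options vary by Flink version):

```shell
# Approximation of how flink-console.sh assembles its logging options.
FLINK_CONF_DIR=${FLINK_CONF_DIR:-/opt/flink/conf}
log_setting=(
  "-Dlog4j.configuration=file:${FLINK_CONF_DIR}/log4j-console.properties"
  "-Dlog4j.configurationFile=file:${FLINK_CONF_DIR}/log4j-console.properties"
  "-Dlogback.configurationFile=file:${FLINK_CONF_DIR}/logback-console.xml"
)
# Because these land after any user-supplied JVM options on the java
# command line, a user-supplied -Dlog4j.configurationFile is overridden
# (the JVM honors the last occurrence of a duplicated -D property).
printf '%s\n' "${log_setting[@]}"
```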
Hey All,
I'm running Flink 1.14.2; it seems like it ignores the system property
-Dlog4j.configurationFile and
falls back to /opt/flink/conf/log4j-console.properties
I enabled debug log for log4j2 ( -Dlog4j2.debug)
DEBUG StatusLogger Catching
java.io.FileNotFoundException: file:/opt/flink
Hi Eddie,
the APIs should be binary compatible across patch releases, so there is no
need to re-compile your artifacts.
Best,
D.
On Sun 19. 12. 2021 at 16:42, Colletta, Edward
wrote:
If I have jar files built using Flink version 11.2 in dependencies, and I upgrade
my cluster to 11.6, is it safe to run the existing jars on the upgraded cluster,
or should I rebuild all jobs against 11.6?
Thanks,
Eddie Colletta
I realised there is an Apache Log4j mailing list.
Regards,
Mr. Turritopsis Dohrnii Teo En Ming
Targeted Individual in Singapore
19 Dec 2021 Sunday
On Fri, 17 Dec 2021 at 00:29, Arvid Heise wrote:
Hi,
Please refer to this link.
Article: Log4j zero-day flaw: What you need to know and how to protect yourself
Link:
https://www.zdnet.com/article/log4j-zero-day-flaw-what-you-need-to-know-and-how-to-protect-yourself/
The article says:
[QUOTE]
WHAT DEVICES AND APPLICATIONS ARE AT RISK
I think this is meant for the Apache log4j mailing list [1].
[1] https://logging.apache.org/log4j/2.x/mail-lists.html
On Thu, Dec 16, 2021 at 4:07 PM David Morávek wrote:
> Hi Turritopsis,
>
> I fail to see any relation to Apache Flink. Can you please elaborate on
> how Flink
Subject: How do I determine which hardware device and software has
log4j zero-day security vulnerability?
Good day from Singapore,
I am working for a Systems Integrator (SI) in Singapore. We have
several clients writing in, requesting us to identify the log4j zero-day
security vulnerability in their corporate infrastructure.
Dear Flink Community,
Yesterday, a new Zero Day for Apache Log4j was reported [1]. It is now
tracked under CVE-2021-44228 [2].
Apache Flink bundles a version of Log4j that is affected by this
vulnerability. We recommend that users follow the advisory [3] of the Apache
Log4j community. For Apache
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>3.2.0</version>
</dependency>

<repository>
    <id>spring-repo</id>
    <url>https://repo1.maven.org/maven2/</url>
</repository>

<plugin>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.1</version>
    <configuration>
        <source>1.8</source>
        <target>1.8</target>
    </configuration>
</plugin>

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.0.</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <artifactSet>
                    <excludes>
                        <exclude>org.apache.flink:force-shading</exclude>
                        <exclude>com.google.code.findbugs:jsr305</exclude>
                        <exclude>org.slf4j:*</exclude>
                    </excludes>
                </artifactSet>
            </configuration>
        </execution>
    </executions>
</plugin>
Hi Ragini,
I think you actually have the opposite problem: your classpath contains the
slf4j binding for log4j 1.2, which is no longer supported. Can you try
getting rid of the slf4j-log4j12 dependency?
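A sketch of what removing it could look like in the pom; the artifact that pulls in slf4j-log4j12 transitively is a placeholder here:

```xml
<dependency>
    <!-- placeholder for the dependency that drags in slf4j-log4j12 -->
    <groupId>org.example</groupId>
    <artifactId>some-library</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

Running `mvn dependency:tree` afterwards shows whether the binding is really gone.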
Best,
D.
On Tue, Sep 14, 2021 at 1:51 PM Ragini Manjaiah
wrote:
When I try to run a Flink 1.13 application, I encounter the below-mentioned
issue. What dependency am I missing? Can you please help me?
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/Users/z004t01/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl
Hi All,
Is it possible to have a tracking id in MDC that will be shared across
chained user-defined operations like Filter, KeySelector, FlatMap,
Process function, and Producer?
The tracking id will be read from the headers of the Kafka message, which, if
possible, we plan to set into MDC in log4j. Right now I
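A minimal sketch of the pattern being asked about, using a plain ThreadLocal map in place of log4j's MDC (in real code you would call org.slf4j.MDC.put/get instead; the tracking-id value and helper names are illustrative). Chained operators run on the same task thread, so a value set at the head of the chain is visible downstream within that chain, though not across a network shuffle:

```java
import java.util.HashMap;
import java.util.Map;

/** Stand-in for log4j's MDC: a per-thread key/value context. */
public final class TrackingContext {
    private static final ThreadLocal<Map<String, String>> CTX =
            ThreadLocal.withInitial(HashMap::new);

    public static void put(String key, String value) { CTX.get().put(key, value); }
    public static String get(String key) { return CTX.get().get(key); }
    public static void clear() { CTX.get().clear(); }

    public static void main(String[] args) {
        // Simulate reading a tracking id from a Kafka record header,
        // then making it visible to chained operations on the same thread.
        String trackingId = "abc-123"; // hypothetical header value
        TrackingContext.put("trackingId", trackingId);

        // A chained "filter" and "map" on the same thread can now read
        // the id without it being part of the record itself.
        boolean kept = filter("payload");
        String enriched = map("payload");
        System.out.println(kept + " " + enriched);
        TrackingContext.clear(); // avoid leaking across records
    }

    private static boolean filter(String record) {
        return TrackingContext.get("trackingId") != null;
    }

    private static String map(String record) {
        return record + " [trackingId=" + TrackingContext.get("trackingId") + "]";
    }
}
```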
Any idea on how I can use logback instead?
On Fri, Aug 23, 2019 at 1:22 PM Vishwas Siravara
wrote:
Hi,
From the Flink doc, in order to use logback instead of log4j: "Users
willing to use logback instead of log4j can just exclude log4j (or delete
it from the lib/ folder)."
https://ci.apache.org/projects/flink/flink-docs-stable/monitoring/logging.html
.
However, when I delete it
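Per that documentation page, the swap in the distribution's lib/ folder is roughly the following (jar versions are placeholders; check the docs for the exact list for your Flink version):

```
lib/ before:  log4j-*.jar, slf4j-log4j12-*.jar          <- remove these
lib/ after:   logback-core-*.jar, logback-classic-*.jar,
              log4j-over-slf4j-*.jar                    <- add these
```

The log4j-over-slf4j bridge is needed so that libraries still logging against the log4j API end up in logback.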
https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/stream/side_output.html
2018-03-20 10:36 GMT+01:00 Puneet Kinra :
Hi,
I have a use case in which I want to log bad records in a separate log file.
I have configured log4j; the log file is getting generated, but the records
also go to the Flink logs. I want to detach it from the Flink logs and write
only to my log file.
Here is the configuration
*(Note: AMSSource is the
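A hedged sketch of the usual log4j 1.x approach for this: give the class its own appender and set additivity=false so the records stop propagating to Flink's root logger (the AMSSource package and file path are placeholders):

```properties
# log4j 1.x sketch; logger/appender names and paths are illustrative.
log4j.logger.com.example.AMSSource=INFO, badRecords
log4j.additivity.com.example.AMSSource=false

log4j.appender.badRecords=org.apache.log4j.FileAppender
log4j.appender.badRecords.File=/path/to/bad-records.log
log4j.appender.badRecords.layout=org.apache.log4j.PatternLayout
log4j.appender.badRecords.layout.ConversionPattern=%d %p %c - %m%n
```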
I didn't find an example of flink-log4j configuration while creating an EMR
cluster for running Flink. What should be passed to the "flink-log4j" config?
The actual log4j config, or a path to a file? Also, how do I see application
logs in EMR?
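If the EMR configurations API is used, the "flink-log4j" classification takes log4j key/value properties directly rather than a file path; a hedged sketch (keys and values are illustrative):

```json
[
  {
    "Classification": "flink-log4j",
    "Properties": {
      "log4j.rootLogger": "INFO,file"
    }
  }
]
```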
thanks
Ishwara Varnasi
-rw-r--r-- 1 robert robert 79966937 Oct 10 13:49 flink-dist_2.10-1.1.3.jar
-rw-r--r-- 1 robert robert    90883 Dec  9 20:13 flink-python_2.10-1.1.3.jar
-rw-r--r-- 1 robert robert    60547 Dec  9 18:45 log4j-1.2-api-2.7.jar
-rw-rw-r-- 1 robert robert  1638598 Oct 22 16:08 log4j2-gelf-1.3.1-shaded.jar
-rw-rw-r-- 1 robert robert     1056 Dec  9 20:12 log4j2.properties
-rw-r--r-- 1 robert robert   219001 Dec  9 18:45 log4j-api-2.7.jar
-rw-r--r-- 1 robert robert  1296865 Dec  9 18:45 log4j-core-2.7.jar
I read through the link you provided, Stephan. However, I am still confused.
The instructions mention specific jar files for Logback; I am not sure which
of the log4j 2.x jars I need to put in the Flink /lib directory. I tried
various combinations of log4j-1.2-api-2.8.jar, log4j-slf4j-impl-2.8
On Thu, Feb 16, 2017 at 11:54 AM, Stephan Ewen wrote:
Hi!
The bundled log4j version (1.x) does not support that.
But you can replace the logging jars with those of a different framework
(like log4j 2.x), which supports changing the configuration without
stopping the application.
You don't need to rebuild flink, simply replace two jars in the
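As a sketch of why log4j 2.x helps here: it can poll its configuration file and apply changes at runtime via monitorInterval (file contents and interval below are illustrative, loosely following the format Flink later shipped by default):

```properties
# log4j2.properties sketch: re-check the file every 30 seconds and
# apply logger-level changes without restarting the JVM.
monitorInterval = 30

rootLogger.level = INFO
rootLogger.appenderRef.console.ref = ConsoleAppender

appender.console.name = ConsoleAppender
appender.console.type = CONSOLE
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d %p %c - %m%n
```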
Is there a way to reload a log4j.properties file without stopping and starting the job server?
Hi Nick,
the name of the "log4j-yarn-session.properties" file might be a bit
misleading. The file is just used for the YARN session client, running
locally.
The Job- and TaskManager are going to use the log4j.properties on the
cluster.
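For the stated goal, a one-line addition to conf/log4j.properties would do it (log4j 1.x syntax, as bundled at the time; the package name is taken from the question):

```properties
# Set com.mycompany classes to DEBUG in conf/log4j.properties
log4j.logger.com.mycompany=DEBUG
```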
On Fri, Mar 11, 2016 at 7:20 PM, Ufuk Celebi wrote:
specific
> log4j.properties to have them honored when running on a YARN cluster? In my
> application jar doesn't work. In the log4j files under flink/conf doesn't
> work.
>
> My goal is to set the log level for 'com.mycompany' classes used in my flink
> application to DEBUG.
>
> Thanks,
> Nick
>
as an additional note: Flink is sending all files in the /lib folder to all
YARN containers. So you could place the XML file in "/lib" and override the
properties.
I think you need to delete the log4j properties from the conf/ directory,
then at least on YARN, we'
the dynamic
property using Flink’s env.java.opts configuration parameter.
Cheers,
Till
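A sketch of that suggestion in flink-conf.yaml (the XML path is a placeholder; log4j 1.x reads the log4j.configuration system property):

```yaml
# flink-conf.yaml: point log4j 1.x at an XML config instead of the
# hardcoded properties file name. The path is illustrative.
env.java.opts: -Dlog4j.configuration=file:///path/to/log4j.xml
```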
On Mon, Dec 21, 2015 at 3:34 PM, Gwenhael Pasquiers <
gwenhael.pasqui...@ericsson.com> wrote:
Hi everybody,
Could it be possible to have a way to configure log4j with XML files?
I've looked into the code, and it looks like the properties file names are
hardcoded. However, we have the need to use XML:
- We log everything into ELK (Elasticsearch / Logstash / Kibana)