+1 to Rong’s approach.
Using Java options and log4j, we could save the user logs to separate files.
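As an illustration of that idea (a sketch, not Flink's shipped config): pass a JVM system property via `env.java.opts`, and let log4j 1.x substitute it into the appender's file path when the properties file is parsed. The property name `user.log.name` is an assumption for illustration.

```properties
# Sketch only: assumes the JVM was started with e.g.
#   -Duser.log.name=my-flink-job   (set via env.java.opts in flink-conf.yaml)
# log4j 1.x resolves ${user.log.name} at config-parse time, so each
# job's user logs land in their own file.
log4j.rootLogger=INFO, userfile
log4j.appender.userfile=org.apache.log4j.FileAppender
log4j.appender.userfile.File=${user.log.name}-user.log
log4j.appender.userfile.layout=org.apache.log4j.PatternLayout
log4j.appender.userfile.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c - %m%n
```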
Best
Yang
Gyula Fóra wrote on Fri, Oct 18, 2019 at 4:41 PM:
Hi all!
Thanks for the answers, this has been very helpful and we could set up a
similar scheme using the Env variables.
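A scheme like this might look roughly as follows (a sketch under assumptions, not the actual setup): logback resolves OS environment variables when parsing logback.xml, so a variable exported by the deployment, e.g. YARN's CONTAINER_ID, can be stamped onto every log line. The variable name and default are assumptions here.

```xml
<!-- Sketch: ${CONTAINER_ID:-unknown} is resolved from the OS environment
     when logback parses this file; "unknown" is the fallback if the
     variable is not exported by the deployment. -->
<configuration>
  <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [${CONTAINER_ID:-unknown}] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="console"/>
  </root>
</configuration>
```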
Cheers,
Gyula
On Tue, Oct 15, 2019 at 9:55 AM Paul Lam wrote:
+1 to Rong’s approach. We use a similar solution to the log context problem
on YARN setups. FYI.
WRT container contextual information, we collect logs via ELK, so the log
file paths (which contain the application id and container id) and the host
are attached to the logs. But if you don’t
Hi Gyula,
Sorry for the late reply. I think it is definitely a challenge in terms of
log visibility.
However, for your requirement I think you can customize your Flink job by
using a custom log formatter/encoder (e.g. via log4j.properties or
logback.xml) together with a suitable logger implementation.
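A minimal sketch of that suggestion with log4j 1.x properties: substitute a system property into the layout's conversion pattern so every line carries the context. The property name `log.context` is hypothetical.

```properties
# Sketch: assumes the JM/TM JVM is started with e.g.
#   -Dlog.context=container_e01_000002   (hypothetical property name)
# ${log.context} is resolved once, when log4j parses this file, and is
# then part of every formatted log line.
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{HH:mm:ss,SSS} [${log.context}] %-5p %c - %m%n
```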
Hi all!
We have been thinking that it would be a great improvement to add
contextual information to the Flink logs:
- Container / yarn / host info to JM/TM logs
- Job info (job id/ jobname) to task logs
I think this should be similar to how the metric scopes are set up and should
be able to provide