it's easy, just restart your flink cluster (standalone mode).
if you run flink in yarn mode, then the result will display in $HADOOP/logs/*.out files

------------------ Original Message ------------------
From: "sidhant gupta" <sidhan...@gmail.com>
Date: Wed, Oct 7, 2020, 9:52
To: <appleyu...@foxmail.com>
Cc: "user" <user@flink.apache.org>
Subject: Re: The file STDOUT does not exist on the TaskExecutor

++ user

On Wed, Oct 7, 2020, 6:47 PM sidhant gupta <sidhan...@gmail.com> wrote:

Hi

I checked in $FLINK_HOME/logs. The .out file was not there. Can you suggest what the action item should be?

Thanks
Sidhant Gupta

On Wed, Oct 7, 2020, 7:17 AM <appleyu...@foxmail.com> wrote:

Please check whether the .out file is in $FLINK_HOME/logs.

------------------ Original Message ------------------
From: "sidhant gupta" <sidhan...@gmail.com>
Date: Wed, Oct 7, 2020, 1:52
To: <appleyu...@foxmail.com>
Subject: Re: The file STDOUT does not exist on the TaskExecutor

Hi,

I am just running the docker container as it is, adding only conf/flink-conf.yaml. I am not sure if the .out file got deleted. Do we need to expose some ports?

Thanks
Sidhant Gupta

On Tue, Oct 6, 2020, 8:51 PM <appleyu...@foxmail.com> wrote:

Hi, I guess you may have deleted the .out file in $FLINK_HOME/logs. You can just use your default log settings.

------------------ Original Message ------------------
From: "sidhant gupta" <sidhan...@gmail.com>
Date: Tue, Oct 6, 2020, 10:59
To: "user" <user@flink.apache.org>
Subject: The file STDOUT does not exist on the TaskExecutor

Hi,

I am running the dockerized flink:1.11.0-scala_2.11 container in ECS. I am getting the following error after the job runs:

ERROR org.apache.flink.runtime.rest.handler.taskmanager.TaskManagerStdoutFileHandler [] - Unhandled exception.
org.apache.flink.util.FlinkException: The file STDOUT does not exist on the TaskExecutor.
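The advice in the thread can be sketched as a quick check. Note the stock Flink distribution writes per-daemon `.out` files under `$FLINK_HOME/log` (singular), although the thread says `logs`; the snippet below uses a throwaway directory so it is runnable anywhere, and the paths are illustrative, not taken from a real install:

```shell
# Throwaway FLINK_HOME so the commands run anywhere; on a real install,
# point FLINK_HOME at your actual Flink directory instead.
FLINK_HOME=$(mktemp -d)
mkdir -p "$FLINK_HOME/log"

# Simulate what the standalone start scripts produce: a per-TaskManager
# .out file capturing anything the job writes to stdout (e.g. print()).
echo "hello from print()" > "$FLINK_HOME/log/flink-user-taskexecutor-0-host.out"

# Standalone mode: look for TaskManager stdout here.
cat "$FLINK_HOME"/log/*taskexecutor*.out

# YARN mode (per the reply above): stdout instead lands under the Hadoop
# log directory on the NodeManager hosts, e.g. $HADOOP_HOME/logs/*.out.
```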
	at org.apache.flink.runtime.taskexecutor.TaskExecutor.lambda$requestFileUploadByFilePath$25(TaskExecutor.java:1742) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604) ~[?:1.8.0_262]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_262]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_262]
	at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_262]

I guess "file" needs to be added to log4j.properties in the docker container, e.g.:

log4j.rootLogger=INFO, file

Are there any other properties that need to be configured in any of the other property files, or any jar that needs to be added to the /opt/flink path?

Thanks
Sidhant Gupta
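For reference, a full log4j 1.x fragment matching the `log4j.rootLogger=INFO, file` line above might look like the sketch below. This is an assumption, not a verified fix: Flink 1.11 ships with log4j2 by default, and the official docker images log to the console rather than to an `.out` file, so check which logging framework and which properties file (`log4j.properties` vs `log4j-console.properties`) your container actually loads before applying anything like this. `${log.file}` is the log path Flink's scripts pass to the JVM via `-Dlog.file`; it may be unset in a docker entrypoint.

```properties
# Hypothetical log4j 1.x configuration; verify your image actually uses
# log4j 1.x and that ${log.file} is set before relying on this.
log4j.rootLogger=INFO, file

log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.file=${log.file}
log4j.appender.file.append=false
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
```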