Ryan Brideau created FLINK-8103:
-----------------------------------

             Summary: Flink 1.4 not writing to standard out log file
                 Key: FLINK-8103
                 URL: https://issues.apache.org/jira/browse/FLINK-8103
             Project: Flink
          Issue Type: Bug
          Components: Core
    Affects Versions: 1.4.0
         Environment: macOS 10.13 (High Sierra)
            Reporter: Ryan Brideau
I built the latest snapshot of 1.4 yesterday and tried testing it with a simple word count example, where StreamUtil is just a helper that checks the input parameters (a hypothetical sketch of it is included at the end of this report for context):

{code:scala}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
    // set up the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val params = ParameterTool.fromArgs(args)
    env.getConfig.setGlobalJobParameters(params)

    val dataStream = StreamUtil.getDataStream(env, params)

    val wordDataStream = dataStream
      .flatMap { _.split(" ") }

    wordDataStream.print()

    // execute program
    env.execute("Words Scala")
  }
}
{code}

This runs without issue on the latest stable release of 1.3 and writes its results to the _out_ file, which I can tail to see the output. On 1.4, however, nothing shows up in that file. Writing the results to a file instead still works:

{code:scala}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.core.fs.FileSystem.WriteMode
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
    // set up the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val params = ParameterTool.fromArgs(args)
    env.getConfig.setGlobalJobParameters(params)

    val dataStream = StreamUtil.getDataStream(env, params)

    val wordDataStream = dataStream
      .flatMap { _.split(" ") }

    wordDataStream
      .writeAsText("file:///somepath/output", WriteMode.OVERWRITE)
      .setParallelism(1)

    // execute program
    env.execute("Words Scala")
  }
}
{code}

Any clues as to what might be causing this?
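For context, StreamUtil is not part of Flink; it is a small project-local helper. A minimal sketch of what such a helper might look like is below, assuming it picks a socket or text-file source based on the supplied parameters; the parameter names ({{input}}, {{host}}, {{port}}) and the fallback behaviour are assumptions, not the actual helper used above:

{code:scala}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

// Hypothetical sketch of the StreamUtil helper referenced above: it only
// validates the supplied parameters and returns a DataStream[String].
// The parameter names ("input", "host", "port") are assumptions.
object StreamUtil {
  def getDataStream(env: StreamExecutionEnvironment,
                    params: ParameterTool): DataStream[String] = {
    if (params.has("input")) {
      // read lines from a text file, e.g. --input /path/to/file
      env.readTextFile(params.get("input"))
    } else if (params.has("host") && params.has("port")) {
      // read lines from a socket, e.g. --host localhost --port 9999
      env.socketTextStream(params.get("host"), params.getInt("port"))
    } else {
      throw new IllegalArgumentException(
        "Specify --input <path> or --host <host> --port <port>")
    }
  }
}
{code}

With a helper along those lines, the job is typically run against a local socket source (e.g. one opened with {{nc -l 9999}}) while tailing the TaskManager's *.out file to watch the printed results.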