Hi,

It seems that you are not subscribed to our mailing list, so I had to
accept your mail manually. It would be good if you could subscribe.

Could you also send us the log output of the JobManager?
If your YARN cluster has log aggregation enabled, you can retrieve the
logs of a stopped YARN session with:

yarn logs -applicationId <AppId>

Look for the jobmanager-main.log file (or a similarly named one).

I suspect that there has been an exception on the JobManager.

Best,
Robert



On Wed, Jan 28, 2015 at 12:01 PM, Bruecke, Christoph <christoph.brue...@campus.tu-berlin.de> wrote:

> Hi,
>
> I have written a job that reads a SequenceFile from HDFS using the
> Hadoop-Compatibility add-on. Doing so results in a TimeoutException. I’m
> using flink-0.9-SNAPSHOT with PR 342 (https://github.com/apache/flink/pull/342).
> Furthermore, I’m running Flink on YARN with two TaskManagers, started via
> flink-yarn-0.9-SNAPSHOT/bin/yarn-session.sh -n 2.
>
> Is this a bug or is there something wrong with the configuration?
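>
> For reference, the input is created roughly like the sketch below. The class
> name, key/value types, and paths are placeholders, not the exact ones from
> ThiaziParser:
>
> import org.apache.flink.api.java.DataSet;
> import org.apache.flink.api.java.ExecutionEnvironment;
> import org.apache.flink.api.java.tuple.Tuple2;
> import org.apache.flink.hadoopcompatibility.mapreduce.HadoopInputFormat;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.io.LongWritable;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.mapreduce.Job;
> import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
> import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
>
> public class SequenceFileReadSketch {
>     public static void main(String[] args) throws Exception {
>         ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
>
>         // Wrap the new-API (mapreduce) SequenceFileInputFormat for Flink.
>         // LongWritable/Text are placeholder key/value types.
>         Job job = Job.getInstance();
>         HadoopInputFormat<LongWritable, Text> input =
>                 new HadoopInputFormat<LongWritable, Text>(
>                         new SequenceFileInputFormat<LongWritable, Text>(),
>                         LongWritable.class, Text.class, job);
>         FileInputFormat.addInputPath(job, new Path("hdfs:///path/to/thiazi-seq")); // placeholder path
>
>         DataSet<Tuple2<LongWritable, Text>> records = env.createInput(input);
>
>         // The real job chains FlatMap / GroupReduce operators and several sinks here;
>         // a single text sink is enough to trigger the read.
>         records.writeAsText("hdfs:///path/to/output"); // placeholder path
>         env.execute("sequence-file-read-sketch");
>     }
> }
>
> The client output of the failing run: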
>
> 01/28/2015 11:42:52     Job execution switched to status RUNNING.
> 01/28/2015 11:42:52     CHAIN DataSource (at createInput(ExecutionEnvironment.java:426) (org.apache.flink.hadoopcompatibility.mapreduce.HadoopInputFormat)) -> FlatMap (FlatMap at main(ThiaziParser.java:37))(1/1) switched to SCHEDULED
> 01/28/2015 11:42:52     CHAIN DataSource (at createInput(ExecutionEnvironment.java:426) (org.apache.flink.hadoopcompatibility.mapreduce.HadoopInputFormat)) -> FlatMap (FlatMap at main(ThiaziParser.java:37))(1/1) switched to DEPLOYING
> 01/28/2015 11:42:52     CHAIN DataSource (at createInput(ExecutionEnvironment.java:426) (org.apache.flink.hadoopcompatibility.mapreduce.HadoopInputFormat)) -> FlatMap (FlatMap at main(ThiaziParser.java:37))(1/1) switched to RUNNING
> 01/28/2015 11:44:32     CHAIN DataSource (at createInput(ExecutionEnvironment.java:426) (org.apache.flink.hadoopcompatibility.mapreduce.HadoopInputFormat)) -> FlatMap (FlatMap at main(ThiaziParser.java:37))(1/1) switched to FAILED
> java.lang.RuntimeException: Requesting the next InputSplit failed.
>         at org.apache.flink.runtime.taskmanager.TaskInputSplitProvider.getNextInputSplit(TaskInputSplitProvider.java:63)
>         at org.apache.flink.runtime.operators.DataSourceTask$1.hasNext(DataSourceTask.java:355)
>         at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:154)
>         at org.apache.flink.runtime.execution.RuntimeEnvironment.run(RuntimeEnvironment.java:204)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after [100 seconds]
>         at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>         at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>         at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>         at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>         at scala.concurrent.Await$.result(package.scala:107)
>         at org.apache.flink.runtime.akka.AkkaUtils$.ask(AkkaUtils.scala:265)
>         at org.apache.flink.runtime.akka.AkkaUtils.ask(AkkaUtils.scala)
>         at org.apache.flink.runtime.taskmanager.TaskInputSplitProvider.getNextInputSplit(TaskInputSplitProvider.java:56)
>         ... 4 more
>
> 01/28/2015 11:44:32     Job execution switched to status FAILING.
> 01/28/2015 11:44:32     GroupReduce (GroupReduce at main(ThiaziParser.java:40))(1/1) switched to CANCELED
> 01/28/2015 11:44:32     DataSink(TextOutputFormat (hdfs://cloud-11.dima.tu-berlin.de:60010/user/cbruecke/output/thiazi-seq/authors) - UTF-8)(1/1) switched to CANCELED
> 01/28/2015 11:44:32     CHAIN GroupReduce (GroupReduce at main(ThiaziParser.java:74)) -> Filter (Filter at main(ThiaziParser.java:97))(1/1) switched to CANCELED
> 01/28/2015 11:44:32     DataSink(TextOutputFormat (hdfs://cloud-11.dima.tu-berlin.de:60010/user/cbruecke/output/thiazi-seq/posts) - UTF-8)(1/1) switched to CANCELED
> 01/28/2015 11:44:32     CHAIN FlatMap (FlatMap at main(ThiaziParser.java:126)) -> Combine(SUM(1), at main(ThiaziParser.java:140)(1/1) switched to CANCELED
> 01/28/2015 11:44:32     Reduce (SUM(1), at main(ThiaziParser.java:140)(1/1) switched to CANCELED
> 01/28/2015 11:44:32     DataSink(CsvOutputFormat (path: hdfs://cloud-11.dima.tu-berlin.de:60010/user/cbruecke/output/thiazi-seq/wordcount, delimiter: ,))(1/1) switched to CANCELED
> 01/28/2015 11:44:32     GroupReduce (GroupReduce at main(ThiaziParser.java:106))(1/1) switched to CANCELED
> 01/28/2015 11:44:32     DataSink(TextOutputFormat (hdfs://cloud-11.dima.tu-berlin.de:60010/user/cbruecke/output/thiazi-seq/threads) - UTF-8)(1/1) switched to CANCELED
> 01/28/2015 11:44:32     Job execution switched to status FAILED.
> Error: The program execution failed: java.lang.RuntimeException: Requesting the next InputSplit failed.
>         at org.apache.flink.runtime.taskmanager.TaskInputSplitProvider.getNextInputSplit(TaskInputSplitProvider.java:63)
>         at org.apache.flink.runtime.operators.DataSourceTask$1.hasNext(DataSourceTask.java:355)
>         at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:154)
>         at org.apache.flink.runtime.execution.RuntimeEnvironment.run(RuntimeEnvironment.java:204)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after [100 seconds]
>         at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>         at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>         at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>         at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>         at scala.concurrent.Await$.result(package.scala:107)
>         at org.apache.flink.runtime.akka.AkkaUtils$.ask(AkkaUtils.scala:265)
>         at org.apache.flink.runtime.akka.AkkaUtils.ask(AkkaUtils.scala)
>         at org.apache.flink.runtime.taskmanager.TaskInputSplitProvider.getNextInputSplit(TaskInputSplitProvider.java:56)
>         ... 4 more
>
>
>
