I have tried it on Spark 1.3.0 (Scala 2.10) and it works. The same code
doesn't work on Spark 1.5.0 (Scala 2.11). It would be nice if anybody could
try it on another installation to confirm whether it is something wrong
with my cluster.
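For what it's worth, the "Stream closed" IOException in the trace quoted below looks like plain BufferedInputStream behavior: close() drops the internal buffer, and any later read() throws. A minimal sketch of just that JDK behavior (the class name StreamClosedDemo and the ByteArrayInputStream stand-in for the executor's stderr pipe are mine, for illustration only):

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.IOException;

public class StreamClosedDemo {
    public static void main(String[] args) throws Exception {
        // FileAppender reads the executor's stdout/stderr through a
        // BufferedInputStream; here a ByteArrayInputStream stands in.
        BufferedInputStream in = new BufferedInputStream(
                new ByteArrayInputStream("log line\n".getBytes()));

        // Killing the executor process closes its streams under the
        // appender's reader thread.
        in.close();

        try {
            in.read(); // getBufIfOpen() throws: buffer is null after close()
        } catch (IOException e) {
            System.out.println(e.getMessage()); // "Stream closed"
        }
    }
}
```

So the ERROR itself may just be the reader thread losing the race with the kill, which would make it noisy but harmless.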

Many thanks,
Petr

On Fri, Sep 18, 2015 at 4:07 PM, Petr Novak <[email protected]> wrote:

> This one is generated, I suppose, after Ctrl+C:
>
> 15/09/18 14:38:25 INFO Worker: Asked to kill executor
> app-20150918143823-0001/0
> 15/09/18 14:38:25 INFO Worker: Asked to kill executor
> app-20150918143823-0001/0
> 15/09/18 14:38:25 DEBUG
> AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled
> message (0.568753 ms) AkkaMessage(KillExecutor(#####,false) from
> Actor[akka://sparkWorker/deadLetters]
> 15/09/18 14:38:25 DEBUG
> AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled
> message (0.568753 ms) AkkaMessage(KillExecutor(#####,false) from
> Actor[akka://sparkWorker/deadLetters]
> 15/09/18 14:38:25 INFO ExecutorRunner: Runner thread for executor
> app-20150918143823-0001/0 interrupted
> 15/09/18 14:38:25 INFO ExecutorRunner: Runner thread for executor
> app-20150918143823-0001/0 interrupted
> 15/09/18 14:38:25 INFO ExecutorRunner: Killing process!
> 15/09/18 14:38:25 INFO ExecutorRunner: Killing process!
> 15/09/18 14:38:25 ERROR FileAppender: Error writing stream to file
> /dfs/spark/work/app-20150918143823-0001/0/stderr
> java.io.IOException: Stream closed
>     at
> java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
>     at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
>     at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>     at java.io.FilterInputStream.read(FilterInputStream.java:107)
>     at
> org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>     at
> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>     at
> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>     at
> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>     at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>     at
> org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
> 15/09/18 14:38:25 ERROR FileAppender: Error writing stream to file
> /dfs/spark/work/app-20150918143823-0001/0/stderr
> java.io.IOException: Stream closed
>     at
> java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
>     at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
>     at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>     at java.io.FilterInputStream.read(FilterInputStream.java:107)
>     at
> org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>     at
> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>     at
> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>     at
> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>     at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>     at
> org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
> 15/09/18 14:38:25 DEBUG FileAppender: Closed file
> /dfs/spark/work/app-20150918143823-0001/0/stderr
> 15/09/18 14:38:25 DEBUG FileAppender: Closed file
> /dfs/spark/work/app-20150918143823-0001/0/stderr
>
> Petr
>
