Hi Michal,
yes, it is logged twice. It can be seen in the log attached to one of my
previous posts, with more details:

15/09/17 23:06:37 INFO StreamingContext: Invoking
stop(stopGracefully=false) from shutdown hook
15/09/17 23:06:37 INFO StreamingContext: Invoking
stop(stopGracefully=false) from shutdown hook
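
For reference, a minimal sketch of opting into a graceful stop instead of
the stopGracefully=false default shown above, assuming Spark Streaming
1.4+ where the shutdown hook from the PR Michal linked below is present;
the object name, app name, master, and queue-backed stream are
illustrative placeholders:

import scala.collection.mutable
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object GracefulShutdownSketch {
  def main(args: Array[String]): Unit = {
    // With this flag set, the shutdown hook logs
    // "Invoking stop(stopGracefully=true) from shutdown hook"
    // and drains in-flight batches instead of stopping immediately.
    val conf = new SparkConf()
      .setAppName("graceful-shutdown-sketch") // placeholder
      .setMaster("local[2]")                  // placeholder
      .set("spark.streaming.stopGracefullyOnShutdown", "true")

    val ssc = new StreamingContext(conf, Seconds(10))

    // A trivial queue-backed stream so start() has an output operation.
    val queue = mutable.Queue(ssc.sparkContext.makeRDD(1 to 10))
    ssc.queueStream(queue).print()

    ssc.start()
    ssc.awaitTermination()
  }
}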

Thanks,
Petr

On Sat, Sep 19, 2015 at 4:01 AM, Michal Čizmazia <mici...@gmail.com> wrote:

> Hi Petr, after Ctrl+C can you see the following message in the logs?
>
> Invoking stop(stopGracefully=false)
>
> Details:
> https://github.com/apache/spark/pull/6307
>
>
> On 18 September 2015 at 10:28, Petr Novak <oss.mli...@gmail.com> wrote:
>
>> It might be connected with my problems with graceful shutdown in Spark
>> 1.5.0 (Scala 2.11):
>> https://mail.google.com/mail/#search/petr/14fb6bd5166f9395
>>
>> Maybe Ctrl+C corrupts checkpoints and breaks gracefulShutdown?
>>
>> Petr
>>
>> On Fri, Sep 18, 2015 at 4:10 PM, Petr Novak <oss.mli...@gmail.com> wrote:
>>
>>> ...to ensure it is not something wrong on my cluster.
>>>
>>> On Fri, Sep 18, 2015 at 4:09 PM, Petr Novak <oss.mli...@gmail.com>
>>> wrote:
>>>
>>>> I have tried it on Spark 1.3.0 (Scala 2.10) and it works. The same code
>>>> doesn't work on Spark 1.5.0 (Scala 2.11). It would be nice if anybody
>>>> could try it on another installation to ensure it is something wrong on
>>>> my cluster.
>>>>
>>>> Many thanks,
>>>> Petr
>>>>
>>>> On Fri, Sep 18, 2015 at 4:07 PM, Petr Novak <oss.mli...@gmail.com>
>>>> wrote:
>>>>
>>>>> This one is generated, I suppose, after Ctrl+C:
>>>>>
>>>>> 15/09/18 14:38:25 INFO Worker: Asked to kill executor
>>>>> app-20150918143823-0001/0
>>>>> 15/09/18 14:38:25 INFO Worker: Asked to kill executor
>>>>> app-20150918143823-0001/0
>>>>> 15/09/18 14:38:25 DEBUG
>>>>> AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled
>>>>> message (0.568753 ms) AkkaMessage(KillExecutor(#####,false) from
>>>>> Actor[akka://sparkWorker/deadLetters]
>>>>> 15/09/18 14:38:25 DEBUG
>>>>> AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled
>>>>> message (0.568753 ms) AkkaMessage(KillExecutor(#####,false) from
>>>>> Actor[akka://sparkWorker/deadLetters]
>>>>> 15/09/18 14:38:25 INFO ExecutorRunner: Runner thread for executor
>>>>> app-20150918143823-0001/0 interrupted
>>>>> 15/09/18 14:38:25 INFO ExecutorRunner: Runner thread for executor
>>>>> app-20150918143823-0001/0 interrupted
>>>>> 15/09/18 14:38:25 INFO ExecutorRunner: Killing process!
>>>>> 15/09/18 14:38:25 INFO ExecutorRunner: Killing process!
>>>>> 15/09/18 14:38:25 ERROR FileAppender: Error writing stream to file
>>>>> /dfs/spark/work/app-20150918143823-0001/0/stderr
>>>>> java.io.IOException: Stream closed
>>>>>     at
>>>>> java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
>>>>>     at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
>>>>>     at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>>>     at java.io.FilterInputStream.read(FilterInputStream.java:107)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>>>>>     at
>>>>> org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
>>>>> 15/09/18 14:38:25 ERROR FileAppender: Error writing stream to file
>>>>> /dfs/spark/work/app-20150918143823-0001/0/stderr
>>>>> java.io.IOException: Stream closed
>>>>>     at
>>>>> java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
>>>>>     at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
>>>>>     at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>>>>>     at java.io.FilterInputStream.read(FilterInputStream.java:107)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>>>>>     at
>>>>> org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>>>>>     at
>>>>> org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
>>>>> 15/09/18 14:38:25 DEBUG FileAppender: Closed file
>>>>> /dfs/spark/work/app-20150918143823-0001/0/stderr
>>>>> 15/09/18 14:38:25 DEBUG FileAppender: Closed file
>>>>> /dfs/spark/work/app-20150918143823-0001/0/stderr
>>>>>
>>>>> Petr
>>>>>
>>>>
>>>>
>>>
>>
>
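
As an aside on the checkpoint hypothesis raised above: a minimal sketch of
the standard checkpoint-recovery pattern. The checkpoint directory, host,
and port are placeholder assumptions; StreamingContext.getOrCreate rebuilds
the context from the checkpoint if one exists, so a checkpoint corrupted by
a hard kill would surface as a failure at that call:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object CheckpointRecoverySketch {
  // Hypothetical directory; use a reliable store such as HDFS.
  val checkpointDir = "hdfs:///tmp/streaming-checkpoint"

  def createContext(): StreamingContext = {
    val conf = new SparkConf().setAppName("checkpoint-recovery-sketch")
    val ssc = new StreamingContext(conf, Seconds(10))
    ssc.checkpoint(checkpointDir)
    // Placeholder source and sink; replace with the real pipeline.
    ssc.socketTextStream("localhost", 9999).print()
    ssc
  }

  def main(args: Array[String]): Unit = {
    // Recreate from the checkpoint if present, otherwise build fresh;
    // deserialization errors from a damaged checkpoint show up here.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}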
