Spark 1.4.0 introduced built-in shutdown hooks that shut down the
StreamingContext and SparkContext (similar to yours). If you are also
installing your own shutdown hook, I wonder what the behavior is going to be.

Try doing a jstack to see where the system is stuck. Alternatively, remove
your shutdown hook and see what happens.
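For reference, a minimal sketch of relying on Spark's built-in hook instead of a custom one. This assumes the `spark.streaming.stopGracefullyOnShutdown` configuration key (added in 1.4, off by default); the app name and batch interval below are just placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("GracefulShutdownExample")
  // Ask Spark's own shutdown hook to stop the StreamingContext
  // gracefully (finish in-flight batches) instead of registering
  // a second hook of our own via sys.ShutdownHookThread.
  .set("spark.streaming.stopGracefullyOnShutdown", "true")

val ssc = new StreamingContext(conf, Seconds(5))
// ... define the streaming computation here ...
ssc.start()
ssc.awaitTermination()  // Ctrl+C now goes through Spark's hook
```

With two hooks registered (yours and Spark's), both may try to stop the same contexts and the JVM can block waiting on one of them, which is why a jstack of the stuck process is the quickest way to see which hook is holding things up.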

On Thu, Sep 10, 2015 at 3:11 AM, Petr Novak <[email protected]> wrote:

> Hello,
> my Spark streaming v1.3.0 code uses
>
> sys.ShutdownHookThread {
>   ssc.stop(stopSparkContext = true, stopGracefully = true)
> }
>
> so that Ctrl+C on the command line stops it. It returned to the command
> line after it finished the batch, but it doesn't with v1.4.0-v1.5.0. Was
> the behaviour or the required code changed?
>
> The last messages are:
>
> [2015-09-08 13:02:43,300] INFO Waited for jobs to be processed and
> checkpoints to be written
> (org.apache.spark.streaming.scheduler.JobGenerator)
> [2015-09-08 13:02:43,300] INFO CheckpointWriter executor terminated ?
> true, waited for 0 ms. (org.apache.spark.streaming.CheckpointWriter)
> [2015-09-08 13:02:43,301] INFO Stopped JobGenerator
> (org.apache.spark.streaming.scheduler.JobGenerator)
> [2015-09-08 13:02:43,302] INFO Stopped JobScheduler
> (org.apache.spark.streaming.scheduler.JobScheduler)
> [2015-09-08 13:02:43,303] INFO stopped
> o.s.j.s.ServletContextHandler{/streaming,null}
> (org.spark-project.jetty.server.handler.ContextHandler)
> [2015-09-08 13:02:43,305] INFO stopped
> o.s.j.s.ServletContextHandler{/streaming/batch,null}
> (org.spark-project.jetty.server.handler.ContextHandler)
> [2015-09-08 13:02:43,307] INFO stopped
> o.s.j.s.ServletContextHandler{/static/streaming,null}
> (org.spark-project.jetty.server.handler.ContextHandler)
>
> Thank you for any explanation,
> Petr
>
