Do you have event logs enabled?
Streaming + event logs enabled can hang the master -
https://issues.apache.org/jira/browse/SPARK-6270
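
If they are, a common workaround until the fix is picked up is to turn event
logging off for the streaming job. A rough Scala sketch of checking and
overriding the setting via SparkConf (the master URL and object name are
placeholders, not taken from your setup):

    import org.apache.spark.SparkConf

    object EventLogCheck {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("spark://your-master:7077") // placeholder master URL
          .setAppName("GuessitDirectKafkaStreaming")

        // spark.eventLog.enabled defaults to false; if it resolves to true here
        // (or in conf/spark-defaults.conf), the standalone master replays the
        // event log when the application finishes, which is the path SPARK-6270
        // points at for long-running streaming apps.
        val eventLogEnabled = conf.getBoolean("spark.eventLog.enabled", false)
        println(s"spark.eventLog.enabled = $eventLogEnabled")

        // Possible workaround: disable event logging for this job.
        conf.set("spark.eventLog.enabled", "false")
      }
    }

The same setting can also be passed as --conf spark.eventLog.enabled=false to
spark-submit or set in conf/spark-defaults.conf.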

On Thu, Sep 17, 2015 at 11:35 PM, ZhuGe <t...@outlook.com> wrote:

> Hi there:
> We recently deployed a streaming application in our standalone cluster, and
> we found an issue when trying to stop the streaming context (which had been
> running for several days) with the kill command in the Spark UI.
> By kill command, I mean the 'kill' button in the "Submission ID" column
> of the "Running Drivers" table.
> It caused the master to hang, and the UI stopped working because all
> requests timed out.
> I used jstat to print the GC info of the master; there are 3-5 young GCs per
> second.
>
> Log is attached below:
> [INFO 2015-09-18 12:26:41 (Logging.scala:59)] Asked to kill driver driver-20150916172518-0002
> [INFO 2015-09-18 12:26:41 (Logging.scala:59)] Kill request for driver-20150916172518-0002 submitted
> [INFO 2015-09-18 12:26:43 (Logging.scala:59)] Received unregister request from application app-20150916172521-0055
> [INFO 2015-09-18 12:26:43 (Logging.scala:59)] Removing app app-20150916172521-0055
> [WARN 2015-09-18 12:26:43 (Logging.scala:71)] Application GuessitDirectKafkaStreaming is still in progress, it may be terminated abnormally.
> [INFO 2015-09-18 12:26:43 (Logging.scala:59)] Changing view acls to: spdcadmin
> [INFO 2015-09-18 12:26:43 (Logging.scala:59)] Changing modify acls to: spdcadmin
> [INFO 2015-09-18 12:26:43 (Logging.scala:59)] SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spdcadmin); users with modify permissions: Set(spdcadmin)
> [WARN 2015-09-18 12:28:46 (Logging.scala:92)] GET / failed: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
> java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
>         at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>         at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>         at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>         at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>         at scala.concurrent.Await$.result(package.scala:107)
>         at org.apache.spark.deploy.master.ui.MasterPage.getMasterState(MasterPage.scala:40)
>         at org.apache.spark.deploy.master.ui.MasterPage.render(MasterPage.scala:74)
>         at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
>         at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
>         at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:69)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
>         at org.spark-project.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
>         at org.spark-project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
>         at org.spark-project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
>         at org.spark-project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
>         at org.spark-project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
>         at org.spark-project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
>         at org.spark-project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
>         at org.spark-project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
>         at org.spark-project.jetty.server.Server.handle(Server.java:370)
>         at org.spark-project.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
>         at org.spark-project.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971)
>         at org.spark-project.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033)
>         at org.spark-project.jetty.http.HttpParser.parseNext(HttpParser.java:644)
>         at org.spark-project.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
>         at org.spark-project.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
>         at org.spark-project.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
>         at org.spark-project.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
>         at org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
>         at org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
>         at java.lang.Thread.run(Thread.java:745)
>
> We have hit this situation several times and could not find any reference
> to this problem.
> Any help would be appreciated!
>
> Cheers
> Ge Zhu
>
