Thanks Shixiong!
To clarify for others, yes, I was speaking of the UI at port 4040, and I do
have event logging enabled, so I can review jobs after the fact. We hope to
upgrade our version of Spark soon, so I'll write back if that resolves it.
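
(For anyone else digging into this: event logging is enabled through
spark-defaults.conf along these lines; the log directory below is a
placeholder, not our actual path.)

    # spark-defaults.conf
    spark.eventLog.enabled   true
    # Where completed-application event logs are written and read back from
    spark.eventLog.dir       hdfs:///spark-events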

Sumona

On Mon, Feb 29, 2016 at 8:27 PM Sea <261810...@qq.com> wrote:

> Hi, Sumona:
>       It's a bug in Spark old version, In spark 1.6.0, it is fixed.
>       After the application complete, spark master will load event log to
> memory, and it is sync because of actor. If the event log is big, spark
> master will hang a long time, and you can not submit any applications, if
> your master memory is to small, you master will die!
>       The solution in spark 1.6 is not very good, the operation is async
> <https://www.baidu.com/link?url=x_WhMZLHfNnhHGknDAZ8Ssl9f7YlEQAvUgpLAGz6cI045umWecBzzh0ho-QkCr2nKnHOPJxIX5_n_zXe51k8z9hVuw4svP6dqWF0JrjabAa&wd=&eqid=be50a41600000f490000000256d50b7b>,
> and so you still need to set a big java heap for master.
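>
> For example, a minimal sketch of giving the master a bigger heap via
> spark-env.sh (SPARK_DAEMON_MEMORY sizes the standalone master and worker
> daemons; the 4g value below is just illustrative):
>
>     # spark-env.sh on the master node
>     # Heap for the standalone daemons (master/worker); default is 1g
>     export SPARK_DAEMON_MEMORY=4g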
>
>
>
> ------------------ Original Message ------------------
> *From:* "Shixiong(Ryan) Zhu";<shixi...@databricks.com>;
> *Sent:* Tuesday, March 1, 2016, 8:02 AM
> *To:* "Sumona Routh"<sumos...@gmail.com>;
> *Cc:* "user@spark.apache.org"<user@spark.apache.org>;
> *Subject:* Re: Spark UI standalone "crashes" after an application finishes
>
> Do you mean you cannot access the Master UI after your application completes?
> Could you check the master log?
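>
> (On a default standalone install, the master's log is typically under
> $SPARK_HOME/logs, e.g.:
>
>     $SPARK_HOME/logs/spark-<user>-org.apache.spark.deploy.master.Master-1-<host>.out
>
> where <user> and <host> are placeholders for your environment.)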
>
> On Mon, Feb 29, 2016 at 3:48 PM, Sumona Routh <sumos...@gmail.com> wrote:
>
>> Hi there,
>> I've been doing some performance tuning of our Spark application, which
>> is using Spark 1.2.1 standalone. I have been using the spark metrics to
>> graph out details as I run the jobs, as well as the UI to review the tasks
>> and stages.
>>
>> I notice that after my application completes, or is near completion, the
>> UI "crashes." I get a Connection Refused response. Sometimes, the page
>> eventually recovers and will load again, but sometimes I end up having to
>> restart the Spark master to get it back. When I look at my graphs on the
>> app, memory consumption for the driver, the executors, and what I believe
>> to be the daemon (spark.jvm.total.used) all appears healthy. Monitoring the
>> master machine itself, memory and CPU appear healthy as well.
>>
>> Has anyone else seen this issue? Are there logs for the UI itself, and
>> where might I find those?
>>
>> Thanks!
>> Sumona
>>
>
>
