As Gourav said, the application UI on port 4040 is no longer available
after your Spark app has finished. You should go to the Spark master's UI
(port 8080) and take a look at "Completed Applications"...

Refer to the docs: http://spark.apache.org/docs/latest/monitoring.html
and read the first "note that" :)
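
For example (a minimal sketch; it assumes a standalone cluster, that the
event-log directory already exists on the master, and that "my-app.jar"
and the master host are placeholders):

  # conf/spark-defaults.conf -- set before the application starts
  spark.eventLog.enabled   true
  spark.eventLog.dir       file:///tmp/spark-events

  # submit as usual; with event logging on, the standalone master
  # (port 8080) can rebuild the finished app's UI from the event log
  ./bin/spark-submit --master spark://master-host:7077 my-app.jar

With that in place the job shows up under "Completed Applications" and
its UI stays viewable after the fact.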

2016-03-01 21:13 GMT+01:00 Gourav Sengupta <gourav.sengu...@gmail.com>:
> Hi,
>
> If you are submitting your SPARK jobs, then the UI is only available
> while the job is running.
>
> Otherwise, if you are starting a SPARK cluster in standalone mode, on
> HADOOP, etc., then the SPARK UI remains alive.
>
> Another way to keep the SPARK UI alive is to use a Jupyter notebook for
> Python or Scala (see Apache Toree), or to use Zeppelin.
>
>
> Regards,
> Gourav Sengupta
>
> On Mon, Feb 29, 2016 at 11:48 PM, Sumona Routh <sumos...@gmail.com> wrote:
>>
>> Hi there,
>> I've been doing some performance tuning of our Spark application, which is
>> using Spark 1.2.1 standalone. I have been using the Spark metrics to graph
>> out details as I run the jobs, as well as the UI to review the tasks and
>> stages.
>>
>> I notice that after my application completes, or is near completion, the
>> UI "crashes." I get a Connection Refused response. Sometimes, the page
>> eventually recovers and will load again, but sometimes I end up having to
>> restart the Spark master to get it back. When I look at my graphs on the
>> app, the memory consumption (of driver, executors, and what I believe to be
>> the daemon (spark.jvm.total.used)) appears to be healthy. Monitoring the
>> master machine itself, memory and CPU appear healthy as well.
>>
>> Has anyone else seen this issue? Are there logs for the UI itself, and
>> where might I find those?
>>
>> Thanks!
>> Sumona
>
>
