Hi Sam,
You might want to have a look at the Spark UI, which runs by default at
http://localhost:4040 for a running application (the standalone master web UI
listens on port 8080). You can also configure Ganglia to monitor your cluster
resources.
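If you go the Ganglia route, a minimal sketch of conf/metrics.properties could
look like the following. The host and port are placeholders for your own gmond
setup, and the GangliaSink class ships in the separate spark-ganglia-lgpl
artifact, so adjust to your build:

  # send driver/executor metrics to Ganglia (values below are placeholders)
  *.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
  *.sink.ganglia.host=your-gmond-host
  *.sink.ganglia.port=8649
  *.sink.ganglia.period=10
  *.sink.ganglia.unit=seconds
  *.sink.ganglia.mode=multicast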
Thank you
Regards
Himanshu Mehra
Hi Sam,
Have a look at Sematext's SPM for your Spark monitoring needs. If the
problem is CPU, IO, network, etc., as Akhil mentioned, you'll see that in
SPM, too.
As for the number of jobs running, you can see a chart with that at
http://sematext.com/spm/integrations/spark-monitoring.html
Otis
--
It could be a CPU, IO, or network bottleneck; you need to figure out where
exactly it's choking. You can use monitoring utilities (like top) to
understand it better, for example as sketched below.
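A rough sketch of the standard Linux tools for this (interface and device
names will vary on your nodes, and iostat/sar come from the sysstat package):

  top            # overall CPU and memory usage per process
  iostat -x 2    # per-device disk utilization and wait times
  vmstat 2       # run queue length, swapping, CPU breakdown
  sar -n DEV 2   # per-interface network throughput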
Thanks
Best Regards
On Sun, Jun 7, 2015 at 4:07 PM, SamyaMaiti wrote:
> Hi All,
>
> I have a Spark SQL application to fetch