Still waiting for a response; any clues or suggestions?
On Tue, Aug 16, 2016 at 4:48 PM, Muhammad Haris <
muhammad.haris.makh...@gmail.com> wrote:
Hi,
I have been trying to collect driver, master, worker, and executor metrics
using Spark 2.0 in standalone mode. Here is what my metrics configuration
file looks like:
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
*.sink.csv.period=1
*.sink.csv.unit=seconds
*.sink.csv.directory=/root/me
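One thing worth double-checking (a sketch, not a confirmed fix): in standalone mode the executors run on the worker hosts, so the metrics config usually has to be shipped with the job and referenced explicitly via spark.metrics.conf. Assuming the config above is saved as /root/metrics.properties and a placeholder application your_app.py, the submit might look like:

```shell
# Ship the metrics config to every executor and point Spark at it.
# master-host, /root/metrics.properties and your_app.py are placeholders.
spark-submit \
  --master spark://master-host:7077 \
  --files /root/metrics.properties \
  --conf spark.metrics.conf=metrics.properties \
  your_app.py
```

If the sink is picked up, CSV files should start appearing under the configured *.sink.csv.directory on each host that runs a component.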
Hi,
Could anybody please guide me on how to get application- or job-level
counters for CPU and memory in Spark 2.0.0 using the REST API?
I have explored the APIs at
http://spark.apache.org/docs/latest/monitoring.html
but did not find anything similar to what MapReduce provides; see the link below:
(
http://hadoo
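The closest thing the monitoring page documents is the per-executor endpoint, which reports memory figures per executor rather than a single job-level counter. A minimal sketch of querying it, assuming a running application and placeholder host/app-id values (endpoint path and field names taken from the monitoring docs linked above):

```python
# Sketch: pull per-executor memory figures from the Spark REST API.
# Endpoint path follows /api/v1/applications/[app-id]/executors;
# host, port, and app id below are placeholders.
import json
from urllib.request import urlopen

def executors_url(host, app_id, port=4040):
    # While the application runs, the driver UI serves the REST API
    # under /api/v1 on the UI port (4040 by default).
    return "http://%s:%d/api/v1/applications/%s/executors" % (host, port, app_id)

def memory_summary(executors):
    # executors: the parsed JSON list returned by the /executors endpoint.
    # Keep just the memory fields, keyed by executor id.
    return {e["id"]: {"memoryUsed": e["memoryUsed"],
                      "maxMemory": e["maxMemory"]}
            for e in executors}

# Usage against a live driver (placeholder app id):
#   with urlopen(executors_url("localhost", "app-20160816-0001")) as resp:
#       print(memory_summary(json.load(resp)))
```

For CPU there is no direct counter in these responses; totalCores and task durations per executor are the nearest proxies, so anything like MapReduce's CPU-milliseconds counter would have to be derived.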