Hi

I need the same through Java.
Doesn't the Spark API support this?
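For the /proc-based approach suggested below, a minimal Java sketch might look like this. It is only a sketch under some assumptions: Linux only, field positions taken from the proc(5) man page, and an arbitrary one-second sampling window; it measures one PID, so for a whole Spark job you would sum over the executor PIDs.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ProcStat {

    // utime + stime (in clock ticks) for one process, from /proc/[pid]/stat.
    // utime and stime are fields 14 and 15 (1-based) per proc(5).
    static long processCpuTicks(String pid) {
        try {
            String stat = new String(Files.readAllBytes(Paths.get("/proc/" + pid + "/stat")));
            // The comm field may contain spaces or parens; skip past the last ')'.
            String[] f = stat.substring(stat.lastIndexOf(')') + 2).split("\\s+");
            return Long.parseLong(f[11]) + Long.parseLong(f[12]); // utime + stime
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Total CPU ticks across all cores, from the first line of /proc/stat.
    static long totalCpuTicks() {
        try {
            String[] f = Files.readAllLines(Paths.get("/proc/stat")).get(0).trim().split("\\s+");
            long sum = 0;
            for (int i = 1; i < f.length; i++) sum += Long.parseLong(f[i]);
            return sum;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) throws Exception {
        String pid = args.length > 0 ? args[0] : "self";
        long p1 = processCpuTicks(pid), t1 = totalCpuTicks();
        Thread.sleep(1000); // sampling window; pick whatever interval suits you
        long p2 = processCpuTicks(pid), t2 = totalCpuTicks();
        double cpuPct = 100.0 * (p2 - p1) / Math.max(1, t2 - t1);
        System.out.printf("CPU%% of pid %s over ~1s: %.2f%n", pid, cpuPct);
    }
}
```

RAM usage can be read the same way from the VmRSS line of /proc/[pid]/status; there is no comparable per-process network counter in /proc, which is why a tool like nethogs comes up below.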

On Wed, Sep 17, 2014 at 2:48 AM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Ganglia does give you cluster-wide and per-machine utilization of
> resources, but I don't think it gives you per-Spark-job metrics. If you
> want to build something from scratch, you can proceed like this:
>
> 1. Login to the machine
> 2. Get the PIDs
> 3. For network IO per process, you can have a look at
> http://nethogs.sourceforge.net/
> 4. You can make use of the information in /proc/[pid]/stat and /proc/stat
> to estimate CPU usage and other metrics
>
>
> Similarly, you can get almost any metric of a process once you have its PID.
>
>
> Thanks
> Best Regards
>
> On Wed, Sep 17, 2014 at 8:59 AM, VJ Shalish <vjshal...@gmail.com> wrote:
>
>> Sorry for the confusion, team.
>> My requirement is to measure the CPU utilisation, RAM usage, network IO
>> and other metrics of a Spark job using a Java program.
>> Please help with the same.
>>
>> On Tue, Sep 16, 2014 at 11:23 PM, Amit <kumarami...@gmail.com> wrote:
>>
>>> Not particularly related to Spark, but you can check out the SIGAR API.
>>> It lets you get CPU, memory, network, filesystem and process-based metrics.
>>>
>>> Amit
>>> On Sep 16, 2014, at 20:14, VJ Shalish <vjshal...@gmail.com> wrote:
>>>
>>> > Hi
>>> >
>>> > I need to get the CPU utilisation, RAM usage, network IO and other
>>> metrics using a Java program. Can anyone help me with this?
>>> >
>>> > Thanks
>>> > Shalish.
>>>
>>
>>
>
