Hi Ranju!

You can configure Spark's metric system.

Check the *memoryMetrics.** group under Executor Metrics
<https://spark.apache.org/docs/3.0.0-preview/monitoring.html#executor-metrics>
and the CPU times under Component instance = executor
<https://spark.apache.org/docs/3.0.0-preview/monitoring.html#component-instance--executor>.
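
To make those metrics visible to an external JMX client such as VisualVM, a rough sketch (the sink/source class names come from the Spark monitoring docs; the port and security flags below are example assumptions you should adjust for your deployment):

```shell
# 1) In conf/metrics.properties, route executor metrics to the JMX sink:
#      executor.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
#      executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
#
# 2) Open a JMX remote port on each executor JVM (port 8090 is an
#    arbitrary example; with several executors per host a fixed port
#    will clash, so this is only a single-executor sketch):
spark-submit \
  --conf spark.executor.extraJavaOptions="-Dcom.sun.management.jmxremote \
    -Dcom.sun.management.jmxremote.port=8090 \
    -Dcom.sun.management.jmxremote.authenticate=false \
    -Dcom.sun.management.jmxremote.ssl=false" \
  your-app.jar
```

Then point VisualVM at host:8090 of the executor node. Disabling authentication and SSL is only acceptable on a trusted network.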

For the details, I suggest checking Luca Canali's presentations on
Spark's metric system, and maybe his GitHub repo
<https://github.com/LucaCanali/sparkMeasure>.
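
As an illustration, sparkMeasure can be pulled in at launch time as a package (the artifact coordinates and version below are an assumption; check the repo README for the current ones):

```shell
# Hypothetical launch command; verify the artifact version in the
# sparkMeasure README before using it.
bin/spark-shell --packages ch.cern.sparkmeasure:spark-measure_2.12:0.17
```

From the shell you can then instrument a query with its StageMetrics helper to collect per-stage CPU and memory figures, which complements what the JMX/VisualVM view gives you.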

Best Regards,
Attila

On Sat, Mar 20, 2021 at 5:41 PM Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> Hi,
>
> Have you considered the Spark GUI first?
>
>
>    view my Linkedin profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
>
> On Sat, 20 Mar 2021 at 16:06, Ranju Jain <ranju.j...@ericsson.com.invalid>
> wrote:
>
>> Hi All,
>>
>>
>>
>> A virtual machine is running an application, and this application has
>> various other 3PP components running, such as Spark, a database, etc.
>>
>>
>>
>> *My requirement is to monitor every component and isolate the resources
>> consumed individually by each component.*
>>
>>
>>
>> I am thinking of using a common tool such as Java VisualVM, where I
>> specify the JMX URL of each component and monitor it.
>>
>>
>>
>> For the other components, I am able to view their resources.
>>
>>
>>
>> *Is there a possibility of viewing the Spark executor CPU/memory via the
>> Java VisualVM tool?*
>>
>>
>>
>> Please guide.
>>
>>
>>
>> Regards
>>
>> Ranju
>>
>
