Well, you can do a fair bit with the available tools.

The Spark UI, particularly the Stages and Executors tabs, provides some
valuable insight into the health of applications using a JDBC source.

Stages Tab:

This section provides a summary of all the stages executed during the
application's lifetime. It includes details such as the stage ID,
description, submission time, duration, and number of tasks.
Each stage represents a set of tasks that perform the same computation,
typically applied to a partition of the input data. The Stages tab offers
insights into how these stages are executed and their associated metrics.
This tab may include a directed acyclic graph (DAG) visualization,
illustrating the logical and physical execution plan of the Spark
application.
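The same stage-level metrics shown in the UI are also exposed over Spark's monitoring REST API (the /api/v1/applications/<app-id>/stages endpoint on the driver UI port, 4040 by default), so you can scrape them into your own monitoring. A minimal Python sketch; the host, port, and application id are placeholders for your environment:

```python
import json
from urllib.request import urlopen  # only needed for the live fetch

def stages_url(ui_base, app_id):
    # Spark monitoring REST endpoint for per-stage metrics
    return f"{ui_base}/api/v1/applications/{app_id}/stages"

def summarise_stages(stages):
    # Keep only the fields useful for spotting slow JDBC-backed stages:
    # task counts and total executor run time per stage.
    return [
        {
            "stageId": s["stageId"],
            "name": s["name"],
            "numTasks": s["numTasks"],
            "executorRunTimeMs": s["executorRunTime"],
        }
        for s in stages
    ]

# Live fetch (commented out so the sketch stands alone):
# stages = json.load(urlopen(stages_url("http://driver-host:4040", "app-20240408-0001")))
# print(summarise_stages(stages))

# Worked example on a hand-made record shaped like the API response:
sample = [{"stageId": 0, "name": "scan jdbc", "numTasks": 8,
           "executorRunTime": 1234}]
print(summarise_stages(sample))
```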

Executors Tab:

The Executors tab provides detailed information about the executors running
in the Spark application. Executors are responsible for executing tasks on
behalf of the application, and this tab offers insight into the current
state and resource usage of each one.
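Likewise, the per-executor figures can be pulled from the /api/v1/applications/<app-id>/executors endpoint. A sketch along the same lines (host and app id are again placeholders) that flags executors with failed tasks, which with a JDBC source can be the first sign of connection trouble on the database side:

```python
def executors_url(ui_base, app_id):
    # Spark monitoring REST endpoint for per-executor metrics
    return f"{ui_base}/api/v1/applications/{app_id}/executors"

def failing_executors(executors):
    # Executors reporting failed tasks; repeated JDBC connection
    # errors usually surface here first.
    return [e["id"] for e in executors if e["failedTasks"] > 0]

# Worked example on hand-made records shaped like the API response:
sample = [
    {"id": "driver", "failedTasks": 0},
    {"id": "1", "failedTasks": 0},
    {"id": "2", "failedTasks": 3},
]
print(failing_executors(sample))  # ['2']
```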

In addition, the underlying database will have some instrumentation to
assist you with your work. Say with Oracle (as an example), utilise tools
like OEM, AWR/Statspack, SQL*Plus scripts etc., or third-party monitoring
tools to collect detailed database health metrics directly from the Oracle
database server.
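For instance, the connection count and per-statement execution times the original question asks about can be read straight from Oracle's V$ views (V$SESSION and V$SQL). A sketch of the queries; the schema name is a bind placeholder, and actually running them needs a driver such as python-oracledb plus SELECT privileges on the V$ views:

```python
# Connections currently open by the application's schema
CONNECTIONS_SQL = """
SELECT count(*)
FROM   v$session
WHERE  username = :schema_name
"""

# Per-statement execution counts and average elapsed time (microseconds)
SQL_TIMING_SQL = """
SELECT sql_id,
       executions,
       elapsed_time / NULLIF(executions, 0) AS avg_elapsed_us
FROM   v$sql
WHERE  parsing_schema_name = :schema_name
ORDER  BY elapsed_time DESC
"""

def fetch(conn, sql, schema_name):
    # conn is a DB-API connection, e.g. from oracledb.connect(...)
    with conn.cursor() as cur:
        cur.execute(sql, schema_name=schema_name)
        return cur.fetchall()
```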

HTH

Mich Talebzadeh,
Technologist | Solutions Architect | Data Engineer  | Generative AI
London
United Kingdom


View my LinkedIn profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>

https://en.everybodywiki.com/Mich_Talebzadeh

*Disclaimer:* The information provided is correct to the best of my
knowledge but of course cannot be guaranteed. It is essential to note
that, as with any advice, "one test result is worth one-thousand
expert opinions" (Wernher von Braun
<https://en.wikipedia.org/wiki/Wernher_von_Braun>).


On Mon, 8 Apr 2024 at 19:35, casel.chen <casel_c...@126.com> wrote:

> Hello, I have a spark application with jdbc source and do some
> calculation.
> To monitor application healthy, I need db related metrics per database
> like number of connections, sql execution time and sql fired time
> distribution etc.
> Does anybody know how to get them? Thanks!
>
>
