Hi Wenchen Fan:
thanks for the reply. In the link, I saw the SQL metrics section, which is very useful.
```
SQL metrics
The metrics of SQL operators are shown in the block of physical operators. The
SQL metrics can be useful when we want to dive into the execution details of
each operator. For example, “number of output rows” can answer how many rows
are output after a Filter operator, “shuffle bytes written total” in an
Exchange operator shows the number of bytes written by a shuffle.
Here is the list of SQL metrics:
```
My question is: besides reading these metrics in the Spark web UI, is there
any way to read them on the driver side in code?
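To illustrate what I mean, here is a rough sketch of the kind of thing I am
hoping is possible. It assumes a `QueryExecutionListener` registered on the
driver; walking `executedPlan` and its `metrics` map is just my guess at how to
get at the values, not something I have confirmed:
```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

val spark = SparkSession.builder().getOrCreate()

// Register a listener on the driver; onSuccess fires after each action completes.
spark.listenerManager.register(new QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit = {
    // Walk the executed physical plan and print each operator's SQL metrics,
    // e.g. "number of output rows" or "shuffle bytes written".
    qe.executedPlan.foreach { node =>
      node.metrics.foreach { case (name, metric) =>
        println(s"${node.nodeName} / $name = ${metric.value}")
      }
    }
  }

  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit = ()
})
```
If this is not the right hook, a pointer to the proper API would be very helpful.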
Best regards
Kelly Zhang
At 2020-04-30 21:38:56, "Wenchen Fan" <cloud0...@gmail.com> wrote:
Does the Spark SQL web UI work for you?
https://spark.apache.org/docs/3.0.0-preview/web-ui.html#sql-tab
On Thu, Apr 30, 2020 at 5:30 PM Manu Zhang <owenzhang1...@gmail.com> wrote:
Hi Kelly,
If you can parse the event log, then try listening for the
`SparkListenerSQLExecutionStart` event and building a `SparkPlanGraph`, as in
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLAppStatusListener.scala#L306.
`SparkPlanGraph` has a `makeDotFile` method with which you can write out a `.dot`
file and visualize it with Graphviz tools, e.g. http://www.webgraphviz.com/
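Roughly along these lines (an untested sketch; since `SparkPlanGraph` is
package-private, it assumes the listener is compiled under the
`org.apache.spark.sql.execution.ui` package, and it passes an empty metrics map
to `makeDotFile` so only the plan shape is rendered):
```scala
// Assumed to live in this package because SparkPlanGraph is package-private.
package org.apache.spark.sql.execution.ui

import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths}

import org.apache.spark.scheduler.{SparkListener, SparkListenerEvent}

class PlanDotListener extends SparkListener {
  override def onOtherEvent(event: SparkListenerEvent): Unit = event match {
    case e: SparkListenerSQLExecutionStart =>
      // Build the plan graph from the physical plan info carried by the event.
      val graph = SparkPlanGraph(e.sparkPlanInfo)
      // makeDotFile takes a nodeId -> metrics-text map; empty gives just the graph shape.
      val dot = graph.makeDotFile(Map.empty)
      Files.write(
        Paths.get(s"plan-${e.executionId}.dot"),
        dot.getBytes(StandardCharsets.UTF_8))
    case _ => // ignore other events
  }
}
```
You can register it with `sparkContext.addSparkListener(new PlanDotListener)`
or via the `spark.extraListeners` conf, then render the `.dot` files with Graphviz.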
Thanks,
Manu
On Thu, Apr 30, 2020 at 3:21 PM zhangliyun <kelly...@126.com> wrote:
Hi all
I want to ask a question: is there any tool to visualize the Spark physical
plan (or the Spark plan in general)? Sometimes the physical plan is very long,
so it is difficult to view.
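For context, right now I just print the plan as text, roughly like this (a toy
example only, to show how I look at it today; in my real case the plan text is
much longer):
```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq((1, "a"), (2, "b")).toDF("id", "name")
  .groupBy("name")
  .count()

// Prints the parsed/analyzed/optimized logical plans and the physical plan as
// plain text; for a wide query this text form quickly becomes hard to read.
df.explain(true)
```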
Best Regards
KellyZhang