Hi guys,

At the moment, I want to integrate Spark SQL execution with Apache Atlas to
show lineage:

`source_table` --SQL statement--> `target_table`

I have tried SAC [https://github.com/hortonworks-spark/spark-atlas-connector],
but it does not support Spark 3.x. Moreover, SAC does not attach SQL
statements to Atlas processes; it only records the plans.

My approach is to create a listener with QueryExecutionListener, and I have
two questions:
- How can we get the SQL statement from a QueryExecution object?
- If we cannot, can we track SQL statements
via QueryPlanningTracker?
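For context, here is a minimal sketch (Spark 3.x, Scala) of what I have so far. As far as I can tell, QueryExecution exposes the analyzed/optimized/physical plans but not the original SQL text, and QueryPlanningTracker records planning phases and rules rather than the statement itself; the class name `AtlasLineageListener` is just a placeholder, and the plan-logging bodies are illustrative only:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

// Placeholder listener: logs the plans of each successful query.
// QueryExecution gives us plans, but (apparently) not the SQL text.
class AtlasLineageListener extends QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit = {
    println(s"analyzed plan:\n${qe.analyzed}")
    println(s"optimized plan:\n${qe.optimizedPlan}")
  }

  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit = ()
}

object LineageDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("lineage-demo")
      .getOrCreate()

    spark.listenerManager.register(new AtlasLineageListener)

    // Each Dataset action now triggers the listener.
    spark.sql("SELECT 1 AS id").collect()

    spark.stop()
  }
}
```

One alternative I am considering, in case it helps frame the question: a SparkListener watching for SparkListenerSQLExecutionStart events, whose `description` field sometimes carries the query text, though I am not sure that is reliable.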

Thanks for your help!


Best wishes,

------------------------------------------------------
Nguyễn Đình Thắng
