I was writing some code to automatically find the list of tables and
databases used in a SparkSQL query. Mainly, I want to auto-check the
permissions and owners of all the tables a query will try to access.

I was wondering whether PySpark exposes any method that would let me
directly use the AST that Spark SQL builds for a query.

Or is there some documentation on how I can generate and inspect that AST
in Spark?
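For reference, here is the rough direction I was considering (an untested
sketch: `sessionState()`/`sqlParser()` are internal Spark APIs reached
through py4j, and the textual format of the unresolved plan that the regex
matches is my assumption, which may differ across Spark versions):

```python
import re

def plan_string(spark, query):
    """Return the string form of Spark's unresolved logical plan for `query`.

    Uses the JVM-side parser via py4j; sessionState()/sqlParser() are
    internal APIs, so this may break between Spark versions.
    """
    jplan = spark._jsparkSession.sessionState().sqlParser().parsePlan(query)
    return jplan.toString()

# Assumption: the unresolved plan prints relations as lines like
#   'UnresolvedRelation [db, tbl], [], false
# (older Spark versions print backticked names instead), so this regex is
# a guess at the textual format, not a stable contract.
RELATION_RE = re.compile(r"'UnresolvedRelation \[([^\]]+)\]")

def tables_in_plan(plan_str):
    """Extract dotted table names (e.g. 'db.tbl') from a plan string."""
    return sorted(
        {".".join(part.strip() for part in m.group(1).split(","))
         for m in RELATION_RE.finditer(plan_str)}
    )

# Usage (needs a live SparkSession):
#   spark = SparkSession.builder.getOrCreate()
#   query = "SELECT * FROM db.t1 JOIN t2 USING (id)"
#   print(tables_in_plan(plan_string(spark, query)))
```

Parsing the plan's string form is obviously fragile; what I am really after
is a supported way to walk the plan tree itself.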

Regards,
AbdealiJK
