Hi AbdealiJK,
In order to get the AST, you can parse your query with the Spark SQL parser:
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan;

LogicalPlan logicalPlan = sparkSession.sessionState().sqlParser()
    .parsePlan("select * from myTable");
Afterwards, you can implement your custom logic and execute the resulting plan like this:
Dataset<Row> ds = Dataset.ofRows(sparkSession, logicalPlan);
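
Since you want to find the tables and databases a query touches, a rough sketch of that custom logic could look like the snippet below. It is untested and assumes a Spark 2.x-style API: table references show up in the parsed (still unresolved) plan as UnresolvedRelation nodes, and the helper name collectTables is just for illustration.

import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation;
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan;

// Recursively walk the parsed plan and record every table reference it contains.
static void collectTables(LogicalPlan node, java.util.List<String> out) {
    if (node instanceof UnresolvedRelation) {
        // tableName() returns the possibly qualified name, e.g. "myDb.myTable"
        out.add(((UnresolvedRelation) node).tableName());
    }
    scala.collection.Seq<LogicalPlan> children = node.children();
    for (int i = 0; i < children.size(); i++) {
        collectTables(children.apply(i), out);
    }
}

java.util.List<String> tables = new java.util.ArrayList<>();
collectTables(logicalPlan, tables);

Note that this only follows the plan's children, so tables referenced from subqueries nested inside expressions (e.g. WHERE x IN (SELECT ...)) would need extra handling, and unqualified names still have to be resolved against the current database on your side.
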
I was writing some code to try to auto-find a list of tables and databases
being used in a SparkSQL query. Mainly I was looking to auto-check the
permissions and owners of all the tables a query will be trying to access.
I was wondering whether PySpark has some method for me to directly use the
AST.