Hello,

I am trying to understand the codebase of Spark SQL, especially the query
analyzer part. I understand that currently (as of Spark 1.4), the sql
module generates a set of candidate physical plans but executes only the
first one in the list (ref:
sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala, around
line 930, where QueryExecution takes planner.plan(optimizedPlan).next()).
I would like to visualize all the generated query plans and possibly
execute each of them as a separate job. How should I proceed in order to
accomplish this?
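
For context, the rough approach I have in mind (a sketch only, against
Spark 1.4 internals; QueryExecution and planner are not public API, the
planner field is protected[sql] so this would have to live under the
org.apache.spark.sql package, and the helper name below is my own):

```scala
package org.apache.spark.sql  // needed because planner is protected[sql]

object AllCandidatePlans {
  // Hypothetical helper: materialize every candidate physical plan for a
  // query instead of taking only the first, as QueryExecution does via
  //   lazy val sparkPlan = planner.plan(optimizedPlan).next()
  def candidatePlans(sqlContext: SQLContext, query: String): Seq[execution.SparkPlan] = {
    val qe = sqlContext.sql(query).queryExecution
    // planner.plan returns an Iterator[SparkPlan]; keep them all
    val plans = sqlContext.planner.plan(qe.optimizedPlan).toSeq
    // Visualize: print each plan's operator tree
    plans.foreach(p => println(p.treeString))
    plans
  }
}
```

Presumably each plan would then still need to go through the
prepareForExecution batch before calling execute() on it, mirroring what
QueryExecution does for the single chosen plan, but I am not sure of the
cleanest way to do that from outside the package.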

Thanks,
Raajay