Hi,

I have read in many blogs that the Spark framework is effectively a compiler itself.

It generates the DAG, optimizes it, and executes it. The DAG is generated from 
the user-submitted code (be it Java, Scala, Python, or R). So when we submit 
a JAR file (which contains the compiled classes), does Spark, as a first step, 
use reflection to read the class files and then generate the DAG? I am not 
quite getting what really happens from the point where the user submits the 
JAR file to DAG generation.
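
For example, here is a minimal Scala sketch of what I understand by DAG 
generation (the object and variable names are just illustrative) -- the 
transformations only record lineage lazily, and the action is what actually 
triggers scheduling and execution:

    import org.apache.spark.sql.SparkSession

    object DagDemo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("DagDemo").getOrCreate()

        val rdd = spark.sparkContext.parallelize(1 to 100)

        // Transformations are lazy: these calls only record lineage
        // (the DAG); no computation happens here.
        val doubled = rdd.map(_ * 2)
        val evens   = doubled.filter(_ % 4 == 0)

        // The action triggers the DAG scheduler and runs the job.
        println(evens.count())

        spark.stop()
      }
    }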

I tried looking for answers but was not able to find any.

Can someone please help?

Thanks!

