No, it's much simpler than that. Spark is just a bunch of APIs that user
applications call into, which causes it to build a DAG and execute it. There's
no need for reflection or transpilation or anything like that. The user app is
just calling the framework directly, not the other way around.
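To make that concrete, here is a minimal sketch (the object name DagExample and local[*] master are just for illustration): the transformations only record lineage, and it's the action at the end that makes Spark turn that lineage into a DAG, optimize it, and run it.

```scala
import org.apache.spark.sql.SparkSession

object DagExample {
  def main(args: Array[String]): Unit = {
    // The user application creates the entry point and calls Spark's APIs directly.
    val spark = SparkSession.builder()
      .appName("dag-example")
      .master("local[*]") // local mode, purely for illustration
      .getOrCreate()

    val numbers = spark.sparkContext.parallelize(1 to 100)

    // These transformations only record lineage; nothing executes yet.
    val doubled = numbers.map(_ * 2)
    val evens   = doubled.filter(_ % 4 == 0)

    // The action below is what triggers Spark to build the DAG of stages,
    // schedule the tasks, and actually run the computation.
    val total = evens.sum()
    println(s"total = $total")

    spark.stop()
  }
}
```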
On Sun, Jan 3, 2021 at 4
Hi,
I have read in many blogs that the Spark framework is itself a compiler:
it generates the DAG, optimizes it, and executes it. The DAG is generated from
the user-submitted code (be it in Java, Scala, Python, or R). So when we submit
a JAR file (it has the list of compiled classes), in the first s