No, it's much simpler than that. Spark is just a set of APIs that user
applications call into; those calls build up the DAG, and an action triggers
its execution. There's no need for reflection or transpilation or anything
like that. The user app is just calling the framework directly, not the other
way around.
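To make that concrete, here is a toy sketch of the lazy-DAG pattern described
above. This is NOT Spark and the names (LazyDataset, collect, etc.) are
invented for illustration; it only mimics the idea that transformations record
lineage and an action walks the recorded graph and executes it:

```python
# Toy illustration (not Spark): transformations record lineage nodes,
# an action walks the lineage and executes it. All names are invented.

class LazyDataset:
    def __init__(self, source, parent=None, op=None):
        self.source = source  # concrete data, only set on the root node
        self.parent = parent  # upstream node in the lineage graph
        self.op = op          # function applied when the graph is executed

    # Transformations: return a new node, compute nothing yet.
    def map(self, f):
        return LazyDataset(None, parent=self,
                           op=lambda rows: [f(r) for r in rows])

    def filter(self, pred):
        return LazyDataset(None, parent=self,
                           op=lambda rows: [r for r in rows if pred(r)])

    # Action: walk lineage back to the source, then apply each op in order.
    def collect(self):
        ops, node = [], self
        while node.parent is not None:
            ops.append(node.op)
            node = node.parent
        rows = node.source
        for op in reversed(ops):
            rows = op(rows)
        return rows

# The "user application" just calls these APIs directly -- no reflection
# over the submitted classes is needed to build the graph.
ds = LazyDataset([1, 2, 3, 4]).map(lambda x: x * 10).filter(lambda x: x > 15)
print(ds.collect())  # [20, 30, 40]
```

Real Spark works the same way at this level: `map`/`filter` on an RDD or
Dataset just extend the lineage, and an action like `collect()` or `count()`
is what actually hands the graph to the scheduler.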

On Sun, Jan 3, 2021 at 4:49 AM Renganathan M <renganatha...@gmail.com>
wrote:

> Hi,
>
> I have read in many blogs that the Spark framework is a compiler itself:
> it generates the DAG, optimizes it, and executes it. The DAG is generated
> from the user-submitted code (be it Java, Scala, Python, or R). So when we
> submit a JAR file (containing the compiled classes), does Spark, as a first
> step, use reflection to read the class files and then generate the DAG? I
> am not quite getting what really happens between the point where the user
> submits the JAR file and DAG generation.
>
> I tried looking for answers but was not able to find any.
>
> Can someone please help?
>
> Thanks!
