Hi Renganathan,

Not quite. It depends on whether you use UDFs, defined in any manner,
whether as UDF objects or plain lambdas. If you have any, they can and
will be called on the executors too, so the executors need a matching
runtime to run them.

On 21/07/29 05:17, Renganathan Mutthiah wrote:
> Hi,
> 
> I have read in many materials (including from the book: Spark - The
> Definitive Guide) that Spark is a compiler.
> 
> In my understanding, our program is used only up to the point of DAG
> generation. This portion can be written in any language: Java, Scala, R,
> or Python. Past that point (executing the DAG), the engine runs in Scala
> only. This leads to Spark being called a compiler.
> 
> If the above is true, we need to install R / Python only on the driver
> machine; the R / Python runtime is not needed on the worker nodes. Am I
> correct?
> 
> Thanks!

-- 
Regards,
Pasha

Big Data Tools @ JetBrains
