Hi Pablo,
As you probably know, Spark SQL generates custom Java code for the SQL
functions. You can use geometry.debugCodegen() to print out the generated code.
Shay
From: Pablo Alcain
Sent: Tuesday, May 3, 2022 6:07 AM
To: user@spark.apache.org
Subject: [EXTERNAL] Parse Execution Plan from PySpark
Hello all! I'm working with PySpark trying to reproduce some of the results
we see on batch through streaming processes, just as a PoC for now. For
this, I'm thinking of trying to interpret the execution plan and eventually
write it back to Python (I'm doing something similar with pandas as well,
a