> As you probably know, Spark SQL generates custom Java code for the SQL
>> functions. You can use geometry.debugCodegen() to print out the generated
>> code.
>>
>> Shay
>>
>> *From:* Pablo Alcain
>> *Sent:* Tuesday, Ma
Hello all! I'm working with PySpark, trying to reproduce through streaming
processes some of the results we see in batch, just as a PoC for now. For
this, I'm thinking of interpreting the execution plan and eventually
writing it back to Python (I'm doing something similar with pandas as well,
a