Re: Documentation on org.apache.spark.sql.functions backend.

2019-09-16 Thread Marco Gaido
Hi Vipul, I am afraid I cannot help you on that. Thanks, Marco

On Mon, 16 Sep 2019 at 10:44, Vipul Rajan wrote:
> Hi Marco,
> That does help. Thanks for taking the time. I am confused as to how that Expression is created. There are methods like eval, nullSafeEval, doGenCode.

Re: Documentation on org.apache.spark.sql.functions backend.

2019-09-16 Thread Vipul Rajan
Hi Marco,

That does help. Thanks for taking the time. I am confused as to how that Expression is created. There are methods like eval, nullSafeEval, and doGenCode. Aren't there any architectural docs that could help explain what exactly is happening? Reverse engineering seems a bit daunting.

Regards
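For readers following along: the three methods named above belong to Catalyst's Expression API. A minimal sketch of a custom expression, assuming Spark 2.4-era internal APIs (this is illustrative, not official documentation; `PlusOne` is a hypothetical class):

```scala
import org.apache.spark.sql.catalyst.expressions.{Expression, UnaryExpression}
import org.apache.spark.sql.catalyst.expressions.codegen.{CodegenContext, ExprCode}
import org.apache.spark.sql.types.{DataType, IntegerType}

// A toy expression that adds 1 to an integer child expression.
case class PlusOne(child: Expression) extends UnaryExpression {
  override def dataType: DataType = IntegerType

  // Interpreted path: used when whole-stage codegen is disabled or falls back.
  // nullSafeEval is only called for non-null input; UnaryExpression.eval
  // handles the null check before delegating here.
  override protected def nullSafeEval(input: Any): Any =
    input.asInstanceOf[Int] + 1

  // Codegen path: emit a Java source fragment that Spark compiles at runtime.
  // defineCodeGen is a helper on UnaryExpression for simple one-input cases.
  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode =
    defineCodeGen(ctx, ev, c => s"$c + 1")
}
```

The split between `eval`/`nullSafeEval` (interpreted) and `doGenCode` (generated Java) is why the same expression appears to be implemented twice: both paths must produce identical results.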

Re: Documentation on org.apache.spark.sql.functions backend.

2019-09-16 Thread Marco Gaido
Hi Vipul, a function is never turned into a logical plan. A function is turned into an Expression, and an Expression can be part of many logical or physical plans. Hope this helps. Thanks, Marco

On Mon, 16 Sep 2019 at 08:27, Vipul Rajan wrote:
> I am trying to create a function that reads data from Kafka, communicates with the Confluent schema registry, and decodes Avro data with evolving schemas.
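Marco's point can be seen directly in how `org.apache.spark.sql.functions` is written: each public function just wraps a Catalyst Expression in a Column. A hedged sketch modeled on that source (the name `myUpper` is hypothetical; `Upper` and `Column` are real):

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.catalyst.expressions.Upper

// A functions.*-style method: it does not build a plan itself; it wraps
// an Expression in a Column. The Expression only becomes part of a logical
// plan later, when the Column is used in select/filter/etc. on a DataFrame.
def myUpper(c: Column): Column = new Column(Upper(c.expr))
```

This is why one Expression instance type can appear in many different logical and physical plans: the plan is assembled by the DataFrame operations, not by the function.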

Documentation on org.apache.spark.sql.functions backend.

2019-09-15 Thread Vipul Rajan
I am trying to create a function that reads data from Kafka, communicates with the Confluent schema registry, and decodes Avro data with evolving schemas. I am trying not to create hack-ish patches but to write proper code that I could maybe even submit pull requests for. Looking at the code, I have been...
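The goal described above can be sketched with Spark's built-in pieces, with the caveat that Spark itself ships no Confluent registry client, so the registry URL and schema string below are assumptions, and Confluent's 5-byte wire-format prefix is exactly the part that needs custom handling:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.avro.from_avro   // spark-avro module, Spark 2.4+
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("avro-demo").getOrCreate()

// Assumption: the writer schema was fetched out of band from the registry,
// e.g. GET http://schema-registry:8081/subjects/<topic>-value/versions/latest
val avroSchemaJson: String =
  """{"type":"record","name":"User","fields":[{"name":"name","type":"string"}]}"""

val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
  .option("subscribe", "users")                     // hypothetical topic
  .load()
  // Caveat: Confluent-serialized messages carry a magic byte + 4-byte
  // schema id before the Avro payload; from_avro does not strip these,
  // which is why evolving schemas need extra code on top of this.
  .select(from_avro(col("value"), avroSchemaJson).as("user"))
```

Handling schema evolution properly means reading that schema id per record and resolving the matching writer schema, which is the part not covered by `from_avro` alone.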