I am trying to create a function that reads data from Kafka, communicates
with the Confluent Schema Registry, and decodes Avro data with evolving
schemas. I would like to avoid hack-ish patches and write proper code that
I could maybe even turn into pull requests. Looking at the code, I have
been able to figure out a few things about how expressions are generated
and how they help accomplish what a function does, but there is still a
ton I just cannot wrap my head around.
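
To make this concrete, here is roughly where I am today: a minimal sketch,
assuming Spark 3's built-in from_avro and the Confluent
CachedSchemaRegistryClient (the topic name, subject, and URLs are
placeholders). It strips the 5-byte Confluent wire-format header and
decodes every record with the latest registered schema:

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.avro.functions.from_avro
import org.apache.spark.sql.functions.{col, expr}

object RegistryAvroSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("registry-avro-sketch").getOrCreate()

    // Ask the registry for the latest writer schema of the value subject.
    val registry = new CachedSchemaRegistryClient("http://localhost:8081", 128)
    val schemaJson = registry.getLatestSchemaMetadata("my-topic-value").getSchema

    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "my-topic")
      .load()

    // Confluent-framed Avro prepends a magic byte plus a 4-byte schema id,
    // so skip the first 5 bytes before handing the payload to from_avro.
    val decoded = raw
      .select(expr("substring(value, 6, length(value) - 5)").as("payload"))
      .select(from_avro(col("payload"), schemaJson).as("record"))

    decoded.writeStream.format("console").start().awaitTermination()
  }
}

This only works as long as every record on the topic was written with that
one schema; as soon as the writer schema evolves per record (which is the
whole point of the registry), the decoder needs to look up the schema id
embedded in each message, and that seems to require a proper catalyst
expression rather than a wrapper like the above.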

I am unable to find any documentation that gets into the nitty-gritty of
Spark's internals at this level. *I am writing in the hope of finding some
help. Do you have any documentation that explains how a function
(org.apache.spark.sql.functions._) is turned into a logical plan?*
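
For anyone in the same boat, the one piece I have figured out so far:
every function in org.apache.spark.sql.functions just wraps a catalyst
Expression in a Column, and operators like select fold those expressions
into the logical plan, which can be inspected directly. A minimal sketch
(the exact printed trees and attribute ids will differ):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.upper

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq("alice", "bob").toDF("name")

// upper(...) returns a Column wrapping the catalyst Upper expression.
val upperCol = upper($"name").as("upper_name")
println(upperCol.expr)  // prints something like: upper('name) AS upper_name

// select(...) folds that expression into a Project node of the logical plan.
val projected = df.select(upperCol)
println(projected.queryExecution.logical)
// prints something like:
//   Project [upper('name) AS upper_name#2]
//   +- LocalRelation [name#0]

That much I can see by experiment; what I cannot find written down is how
the analyzer and optimizer take the tree from there, which is where I get
lost.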
