radekaadek commented on issue #2402:
URL: https://github.com/apache/sedona/issues/2402#issuecomment-3430781460

   I use Apache Flink through a custom Docker image based on the [official image](https://hub.docker.com/_/flink), to which I add the jar dependencies for the connectors and plugins I need. I then spin up this custom image and interface with it through the [SQL Gateway](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sql-gateway/overview/#starting-the-sql-gateway)'s REST API.
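
   For reference, the image build is roughly the following (a minimal sketch; the Flink tag and jar paths are placeholders, not the exact ones I use):

   ```dockerfile
   # Start from the official Flink image; the tag is a placeholder.
   FROM flink:1.18

   # Put the connector/plugin jars on Flink's classpath.
   # `lib/` is a hypothetical local directory holding those jars.
   COPY lib/*.jar /opt/flink/lib/
   ```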
   
   
   While looking for a solution to this problem, I stumbled upon [modules](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/modules/) in Flink, whose documentation explicitly states:
   
   > For example, users can define their own geo functions and plug them into Flink as built-in functions to be used in Flink SQL and Table APIs.
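
   To make that concrete, exposing a function as "built-in" seems to boil down to implementing Flink's `Module` interface. A minimal sketch (the `GEO_DISTANCE` function and its planar math are made up for illustration, not a real Sedona function):

   ```java
   import java.util.Collections;
   import java.util.Map;
   import java.util.Optional;
   import java.util.Set;

   import org.apache.flink.table.functions.FunctionDefinition;
   import org.apache.flink.table.functions.ScalarFunction;
   import org.apache.flink.table.module.Module;

   // Hypothetical module exposing one geo-style function to Flink SQL.
   public class GeoModule implements Module {

       // Toy scalar function standing in for a real geo function.
       public static class GeoDistance extends ScalarFunction {
           public Double eval(Double x1, Double y1, Double x2, Double y2) {
               double dx = x2 - x1, dy = y2 - y1;
               return Math.sqrt(dx * dx + dy * dy); // planar distance, illustration only
           }
       }

       private static final Map<String, FunctionDefinition> FUNCTIONS =
               Collections.singletonMap("GEO_DISTANCE", new GeoDistance());

       @Override
       public Set<String> listFunctions() {
           return FUNCTIONS.keySet();
       }

       @Override
       public Optional<FunctionDefinition> getFunctionDefinition(String name) {
           return Optional.ofNullable(FUNCTIONS.get(name.toUpperCase()));
       }
   }
   ```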
   
   As the documentation [later states](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/modules/#how-to-load-unload-use-and-list-modules), these modules can then be loaded by calling `LOAD MODULE` in the SQL Client.
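
   From what I can tell, hooking into `LOAD MODULE` requires a `ModuleFactory` registered through Java SPI (a file named `META-INF/services/org.apache.flink.table.factories.Factory`). Something along these lines, assuming the Flink 1.13+ factory SPI; the identifier `geo` is a made-up example:

   ```java
   import java.util.Collections;
   import java.util.Set;

   import org.apache.flink.configuration.ConfigOption;
   import org.apache.flink.table.factories.ModuleFactory;
   import org.apache.flink.table.module.Module;

   // Hypothetical factory that lets the SQL Client resolve `LOAD MODULE geo;`.
   public class GeoModuleFactory implements ModuleFactory {

       @Override
       public String factoryIdentifier() {
           return "geo"; // matches the module name used in LOAD MODULE
       }

       @Override
       public Module createModule(Context context) {
           return new GeoModule(); // the module sketched above
       }

       @Override
       public Set<ConfigOption<?>> requiredOptions() {
           return Collections.emptySet();
       }

       @Override
       public Set<ConfigOption<?>> optionalOptions() {
           return Collections.emptySet();
       }
   }
   ```

   With the factory jar on the classpath, running `LOAD MODULE geo;` in the SQL Client should make the module's functions resolvable like built-ins.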
   
   This seems very promising to me, but the documentation doesn't state whether custom types/serializers can be loaded the same way. However, I did find some documentation on third-party serializers [here](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/datastream/fault-tolerance/serialization/third_party_serializers/). I'm not very familiar with what Sedona needs in order to work correctly and to register its types, but I suspect that documentation can help us.
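
   Following that page's Kryo registration approach, I imagine the geometry types could be handled with something like the sketch below. `GeometryKryoSerializer` is an assumed name I made up, encoding JTS geometries as WKB; I don't know whether this matches what Sedona actually needs:

   ```java
   import com.esotericsoftware.kryo.Kryo;
   import com.esotericsoftware.kryo.Serializer;
   import com.esotericsoftware.kryo.io.Input;
   import com.esotericsoftware.kryo.io.Output;
   import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
   import org.locationtech.jts.geom.Geometry;
   import org.locationtech.jts.io.ParseException;
   import org.locationtech.jts.io.WKBReader;
   import org.locationtech.jts.io.WKBWriter;

   // Hypothetical Kryo serializer that encodes JTS geometries as WKB.
   public class GeometryKryoSerializer extends Serializer<Geometry> {

       @Override
       public void write(Kryo kryo, Output output, Geometry geom) {
           byte[] wkb = new WKBWriter().write(geom);
           output.writeInt(wkb.length);
           output.writeBytes(wkb);
       }

       @Override
       public Geometry read(Kryo kryo, Input input, Class<Geometry> type) {
           byte[] wkb = input.readBytes(input.readInt());
           try {
               return new WKBReader().read(wkb);
           } catch (ParseException e) {
               throw new RuntimeException("Failed to decode WKB geometry", e);
           }
       }

       public static void main(String[] args) {
           StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
           // Register as a default serializer so subclasses (Point, Polygon, ...)
           // are covered too, per the serialization docs linked above.
           env.getConfig().addDefaultKryoSerializer(Geometry.class, GeometryKryoSerializer.class);
       }
   }
   ```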

