Hi,
As far as I can tell, the answer is unfortunately no. With the Table API (SQL)
things are much simpler, since there is only a restricted set of column types
to support and no need to handle arbitrary Java classes as the records.
I'm shooting blindly here, but maybe you can use
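One detour that should work today is to bridge the DataStream into the Table
API, apply the Python UDF there, and convert the result back to a DataStream.
Below is only a rough, untested sketch against Flink 1.13-style APIs; the
function name py_upper, the module my_udfs and the file path are placeholders,
and the Python function itself would be a plain PyFlink scalar UDF.

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

import static org.apache.flink.table.api.Expressions.$;
import static org.apache.flink.table.api.Expressions.call;

public class PythonUdfDetour {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Ship the Python module with the job (equivalent to -pyfs on the CLI); placeholder path.
        tEnv.getConfig().getConfiguration().setString("python.files", "/path/to/my_udfs.py");

        // Register the Python scalar function through SQL DDL.
        tEnv.executeSql(
                "CREATE TEMPORARY SYSTEM FUNCTION py_upper AS 'my_udfs.py_upper' LANGUAGE PYTHON");

        DataStream<String> words = env.fromElements("hello", "flink");

        // Bridge the DataStream into the Table API (the atomic String is assumed to
        // surface as column f0), apply the Python UDF, then convert back to a DataStream.
        Table result = tEnv.fromDataStream(words).select(call("py_upper", $("f0")));
        DataStream<Row> upperCased = tEnv.toDataStream(result);
        upperCased.print();

        env.execute("python-udf-detour");
    }
}

This leans on exactly the restriction mentioned above: once the data sits in a
Table, the column types are known, so the Python side never has to deal with
arbitrary Java classes.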
Hello,
Currently, Flink already supports adding a Python UDF and using it in a Flink
Java job through the Table API. Can we do the same thing, i.e. create a custom
Python function for a DataStream transformation and use it in a Flink Java job?
Regards,
Jesry
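For reference, the existing Table API support mentioned in the question looks
roughly like this from the Java side. Again only a sketch; add_one, my_udfs and
the file path are placeholders for a PyFlink scalar UDF defined in my_udfs.py.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class PythonUdfFromJava {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        // Make the Python module available to the Python workers (or pass -pyfs on the CLI).
        tEnv.getConfig().getConfiguration().setString("python.files", "/path/to/my_udfs.py");

        // Register the Python scalar function so it can be called from SQL / Table API.
        tEnv.executeSql(
                "CREATE TEMPORARY SYSTEM FUNCTION add_one AS 'my_udfs.add_one' LANGUAGE PYTHON");

        // Use it like any other scalar function in a query.
        tEnv.executeSql("SELECT add_one(v) FROM (VALUES (1), (2), (3)) AS t(v)").print();
    }
}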