Re: Best practice for adding support for Kafka variants

2021-06-03 Thread deepthi Sridharan
Makes sense. Thanks for the confirmation.

Re: Best practice for adding support for Kafka variants

2021-06-03 Thread Arvid Heise
Just to add, we target that for 1.14. However, it's also not too complicated to add a new TableFactory that uses the new sources (or your source).
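
A rough sketch of what such a TableFactory could look like, assuming a custom FLIP-27 source that produces RowData; the connector identifier, the option names, and the InternalKafkaSource placeholder are all hypothetical, not part of any existing connector:

    import java.util.HashSet;
    import java.util.Set;

    import org.apache.flink.configuration.ConfigOption;
    import org.apache.flink.configuration.ConfigOptions;
    import org.apache.flink.configuration.ReadableConfig;
    import org.apache.flink.table.connector.ChangelogMode;
    import org.apache.flink.table.connector.source.DynamicTableSource;
    import org.apache.flink.table.connector.source.ScanTableSource;
    import org.apache.flink.table.connector.source.SourceProvider;
    import org.apache.flink.table.factories.DynamicTableSourceFactory;
    import org.apache.flink.table.factories.FactoryUtil;

    /** Table factory exposing a custom Kafka-variant source as 'connector' = 'my-kafka'. */
    public class MyKafkaDynamicTableFactory implements DynamicTableSourceFactory {

        public static final ConfigOption<String> TOPIC =
                ConfigOptions.key("topic").stringType().noDefaultValue();
        public static final ConfigOption<String> SERVERS =
                ConfigOptions.key("bootstrap.servers").stringType().noDefaultValue();

        @Override
        public String factoryIdentifier() {
            return "my-kafka"; // referenced as 'connector' = 'my-kafka' in DDL
        }

        @Override
        public Set<ConfigOption<?>> requiredOptions() {
            Set<ConfigOption<?>> options = new HashSet<>();
            options.add(TOPIC);
            options.add(SERVERS);
            return options;
        }

        @Override
        public Set<ConfigOption<?>> optionalOptions() {
            return new HashSet<>();
        }

        @Override
        public DynamicTableSource createDynamicTableSource(Context context) {
            FactoryUtil.TableFactoryHelper helper =
                    FactoryUtil.createTableFactoryHelper(this, context);
            helper.validate();
            ReadableConfig options = helper.getOptions();
            return new MyKafkaTableSource(options.get(TOPIC), options.get(SERVERS));
        }

        /** Scan source that hands the planner a FLIP-27 source instance. */
        private static class MyKafkaTableSource implements ScanTableSource {
            private final String topic;
            private final String servers;

            MyKafkaTableSource(String topic, String servers) {
                this.topic = topic;
                this.servers = servers;
            }

            @Override
            public ChangelogMode getChangelogMode() {
                return ChangelogMode.insertOnly();
            }

            @Override
            public ScanRuntimeProvider getScanRuntimeProvider(ScanContext runtimeProviderContext) {
                // InternalKafkaSource is a placeholder for the custom FLIP-27 source
                // producing RowData records.
                return SourceProvider.of(new InternalKafkaSource(topic, servers));
            }

            @Override
            public DynamicTableSource copy() {
                return new MyKafkaTableSource(topic, servers);
            }

            @Override
            public String asSummaryString() {
                return "my-kafka";
            }
        }
    }

For the planner to discover the factory, the class also needs to be listed in META-INF/services/org.apache.flink.table.factories.Factory of the connector jar.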

Re: Best practice for adding support for Kafka variants

2021-06-03 Thread Chesnay Schepler
FLIP-27 was primarily aimed at the DataStream API; the integration into the SQL/Table APIs will happen at a later date.
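
For illustration, this is roughly how a FLIP-27 style source is wired into the DataStream API today, using the stock KafkaSource from flink-connector-kafka as a stand-in (broker address, topic, and group id are placeholders):

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class Flip27DataStreamExample {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // A FLIP-27 source is attached via fromSource(), not addSource().
            KafkaSource<String> source = KafkaSource.<String>builder()
                    .setBootstrapServers("broker-1:9092")      // placeholder address
                    .setTopics("input-topic")                  // placeholder topic
                    .setGroupId("my-group")
                    .setStartingOffsets(OffsetsInitializer.earliest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();

            DataStream<String> stream =
                    env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source");

            stream.print();
            env.execute("FLIP-27 source example");
        }
    }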

Re: Best practice for adding support for Kafka variants

2021-06-01 Thread deepthi Sridharan
Thank you, Roman. I should have said our own flavor of Kafka and not version. Thanks for the reference to the new source and sink interfaces, though, as it seems like those are the interfaces we should be implementing to use our custom Kafka connector. I did notice, however, that the FLIP does not cover the Table/SQL APIs.

Re: Best practice for adding support for Kafka variants

2021-05-20 Thread Roman Khachatryan
Hi, Those classes will likely be deprecated in the future in favor of the FLIP-27 [1][2] source and FLIP-143 [3] sink implementations, and eventually removed (though it won't happen soon). You should probably take a look at the above new APIs. Either way, there is no such recommendation AFAIK.
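
For anyone skimming the thread, a bare skeleton of the FLIP-27 Source interface mentioned above; the class name and the split/checkpoint-state types (MySplit, MyEnumState) are hypothetical, and the reader/enumerator bodies are left as placeholders where the internal Kafka client would plug in:

    import org.apache.flink.api.connector.source.Boundedness;
    import org.apache.flink.api.connector.source.Source;
    import org.apache.flink.api.connector.source.SourceReader;
    import org.apache.flink.api.connector.source.SourceReaderContext;
    import org.apache.flink.api.connector.source.SourceSplit;
    import org.apache.flink.api.connector.source.SplitEnumerator;
    import org.apache.flink.api.connector.source.SplitEnumeratorContext;
    import org.apache.flink.core.io.SimpleVersionedSerializer;

    /** Hypothetical split type: one split per partition of the internal Kafka variant. */
    class MySplit implements SourceSplit {
        private final String partition;
        MySplit(String partition) { this.partition = partition; }
        @Override public String splitId() { return partition; }
    }

    /** Hypothetical enumerator checkpoint state (e.g. partitions already assigned). */
    class MyEnumState {}

    /** Skeleton of a FLIP-27 source for a custom Kafka variant. */
    public class MyKafkaVariantSource implements Source<String, MySplit, MyEnumState> {

        @Override
        public Boundedness getBoundedness() {
            return Boundedness.CONTINUOUS_UNBOUNDED; // a Kafka-style source is unbounded
        }

        @Override
        public SourceReader<String, MySplit> createReader(SourceReaderContext readerContext) {
            throw new UnsupportedOperationException("reader backed by the internal Kafka client goes here");
        }

        @Override
        public SplitEnumerator<MySplit, MyEnumState> createEnumerator(
                SplitEnumeratorContext<MySplit> enumContext) {
            throw new UnsupportedOperationException("partition discovery / assignment goes here");
        }

        @Override
        public SplitEnumerator<MySplit, MyEnumState> restoreEnumerator(
                SplitEnumeratorContext<MySplit> enumContext, MyEnumState checkpoint) {
            throw new UnsupportedOperationException("restore enumerator from checkpointed state");
        }

        @Override
        public SimpleVersionedSerializer<MySplit> getSplitSerializer() {
            throw new UnsupportedOperationException("serializer for splits");
        }

        @Override
        public SimpleVersionedSerializer<MyEnumState> getEnumeratorCheckpointSerializer() {
            throw new UnsupportedOperationException("serializer for enumerator checkpoint state");
        }
    }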

Best practice for adding support for Kafka variants

2021-05-20 Thread deepthi Sridharan
Hi, We have an internal version of the open-source Kafka consumer and producer that we use, and we are working on adding it as a source and sink for Flink. It seems like the easiest way to add the consumer as a source would be to override the FlinkKafkaConsumer class's createFetcher method.
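
As a point of reference, a minimal sketch of wrapping FlinkKafkaConsumer with variant-specific client properties (class name, topic, and property values are hypothetical); anything beyond client configuration, such as a different wire protocol, would indeed require overriding protected hooks like createFetcher or implementing a new source:

    import java.util.Properties;

    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

    /**
     * Thin wrapper that pre-configures the internal Kafka variant's client settings.
     */
    public class InternalKafkaConsumer extends FlinkKafkaConsumer<String> {

        public InternalKafkaConsumer(String topic, Properties userProps) {
            super(topic, new SimpleStringSchema(), withInternalDefaults(userProps));
        }

        private static Properties withInternalDefaults(Properties userProps) {
            Properties props = new Properties();
            props.putAll(userProps);
            // Placeholder settings for the internal variant (hypothetical values).
            props.putIfAbsent("bootstrap.servers", "internal-kafka:9092");
            props.putIfAbsent("security.protocol", "SASL_SSL");
            return props;
        }
    }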