Re: When should we use flink-json instead of Jackson directly?

2022-10-28 Thread Yaroslav Tkachenko
Hey Vishal, I guess you're using the DataStream API? In this case, you have more control over data serialization, so it makes sense to use custom serialization logic. IMO, flink-json (as well as flink-avro, flink-csv, etc.) is really helpful when using the Table API / SQL API, because it contains the…
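For the DataStream API case the usual pattern is to implement Flink's `DeserializationSchema` and delegate to your own customized Jackson `ObjectMapper`. Since neither Flink nor Jackson is on the classpath here, this is only a self-contained sketch: the `DeserializationSchema` interface is a locally defined stand-in for Flink's `org.apache.flink.api.common.serialization.DeserializationSchema`, the `Event` type is hypothetical, and a regex stands in for the Jackson `readValue` call.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CustomJsonDeserialization {

    // Stand-in for Flink's DeserializationSchema<T>; in a real job you
    // would implement the Flink interface and hand it to your source.
    interface DeserializationSchema<T> {
        T deserialize(byte[] message) throws IOException;
    }

    // Hypothetical event type, for illustration only.
    static final class Event {
        final long id;
        final String name;
        Event(long id, String name) { this.id = id; this.name = name; }
        public long id() { return id; }
        public String name() { return name; }
    }

    // In production this method would delegate to your customized Jackson
    // ObjectMapper (mapper.readValue(message, Event.class)); a regex stands
    // in here so the sketch runs without external dependencies.
    static Event parseEvent(byte[] message) throws IOException {
        String json = new String(message, StandardCharsets.UTF_8);
        Matcher m = Pattern
            .compile("\\{\"id\":(\\d+),\"name\":\"([^\"]*)\"\\}")
            .matcher(json);
        if (!m.matches()) {
            throw new IOException("Unparseable record: " + json);
        }
        return new Event(Long.parseLong(m.group(1)), m.group(2));
    }

    // Wiring the parse logic into the schema interface.
    static final class EventSchema implements DeserializationSchema<Event> {
        @Override
        public Event deserialize(byte[] message) throws IOException {
            return parseEvent(message);
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] raw = "{\"id\":42,\"name\":\"order-created\"}"
            .getBytes(StandardCharsets.UTF_8);
        Event e = new EventSchema().deserialize(raw);
        System.out.println(e.id() + " " + e.name());
    }
}
```

The point of the pattern is that the mapper lives inside the schema, so all the customization Vishal mentions stays in one place regardless of which connector feeds the stream.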

When should we use flink-json instead of Jackson directly?

2022-10-28 Thread Vishal Surana
I've been using Jackson to deserialize JSON messages into Scala classes and Java POJOs. The object mapper is heavily customized for our use cases. It seems that flink-json internally uses Jackson as well and allows for injecting our own mappers. Would there be any benefit of using flink-json instead…

Re: [ANNOUNCE] Apache Flink 1.16.0 released

2022-10-28 Thread Jing Ge
Congrats! On Fri, Oct 28, 2022 at 1:22 PM 任庆盛 wrote: > Congratulations and a big thanks to Chesnay, Martijn, Godfrey and Xingbo > for the awesome work for 1.16! > > Best regards, > Qingsheng Ren > > > On Oct 28, 2022, at 14:46, Xingbo Huang wrote: > > > > The Apache Flink community is very happy…

Re: [ANNOUNCE] Apache Flink 1.16.0 released

2022-10-28 Thread 任庆盛
Congratulations and a big thanks to Chesnay, Martijn, Godfrey and Xingbo for the awesome work for 1.16! Best regards, Qingsheng Ren > On Oct 28, 2022, at 14:46, Xingbo Huang wrote: > > The Apache Flink community is very happy to announce the release of Apache > Flink 1.16.0, which is the first…

Performing left join between two streams

2022-10-28 Thread Surendra Lalwani via user
Hi Team, Is it possible in Flink to perform a left outer join between two streams, as is possible in Spark? I checked the interval join and it only supports inner joins as of now. Thanks and Regards, Surendra Lalwani
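In the DataStream API a left outer join is typically expressed with `coGroup` over a window, emitting each left element with a null (or placeholder) right side when no match arrives in that window; in the Table/SQL API, `LEFT OUTER JOIN` between two streaming tables is supported directly. Since Flink isn't on the classpath here, the following plain-Java sketch only shows the per-key, per-window logic a `CoGroupFunction` would implement; the key/value shapes are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class LeftOuterJoinSketch {

    // Hypothetical joined row: key, left value, and the matched right
    // value, or null when no right-side match existed.
    record Joined(String key, String leftVal, String rightVal) {}

    // The logic a windowed CoGroupFunction performs per key: pair each
    // left element with every matching right element, or emit it once
    // with a null right side if the right group is empty (left outer join).
    static List<Joined> leftOuterJoin(Map<String, List<String>> left,
                                      Map<String, List<String>> right) {
        List<Joined> out = new ArrayList<>();
        for (Map.Entry<String, List<String>> e : left.entrySet()) {
            List<String> matches = right.getOrDefault(e.getKey(), List.of());
            for (String l : e.getValue()) {
                if (matches.isEmpty()) {
                    out.add(new Joined(e.getKey(), l, null)); // unmatched left row survives
                } else {
                    for (String r : matches) {
                        out.add(new Joined(e.getKey(), l, r));
                    }
                }
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, List<String>> orders =
            Map.of("o1", List.of("order-A"), "o2", List.of("order-B"));
        Map<String, List<String>> payments =
            Map.of("o1", List.of("pay-A"));
        for (Joined j : leftOuterJoin(orders, payments)) {
            System.out.println(j);
        }
    }
}
```

The key difference from the built-in `join`/interval join is exactly the `matches.isEmpty()` branch: unmatched left rows are emitted rather than dropped, which is why `coGroup` (which sees both groups, even when one is empty) is the usual DataStream building block for outer joins.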