not below the min.insync.replicas setting of
> Kafka. I had the same problem and that was it for me.
>
> Frank
>
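(For reference, a minimal sketch of how to check this. The topic name `my-topic` and the broker address are assumptions, not from the thread: if a topic's replication factor falls below its effective min.insync.replicas, producers using acks=all fail with not-enough-replicas errors.)

```shell
# Show the topic's replication factor and partition assignment
kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic my-topic

# Show any topic-level min.insync.replicas override
kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name my-topic --describe

# If needed, lower the topic-level setting (sketch; pick a value <= replication factor)
kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name my-topic \
  --alter --add-config min.insync.replicas=1
```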
> On Sat, 9 Apr 2022, 04:01 Praneeth Ramesh wrote:
>
>> Hi All
>>
> >> I have a job which reads from Kafka and applies some transactions and
> >> writes the data back to Kafka.
astream/fault-tolerance/serialization/types_serialization/#special-types
From this documentation I found that it is supported in the DataStream API.
Is there a way to do this in the Table API?
Thanks in advance for the help.
--
Regards
Praneeth Ramesh
/poc-flink-0.0.1.jar $FLINK_HOME/lib/poc-flink.jar`
So the same classloader is used, and that solved the problem.
Wondering if any of you have tried bundling user code in $FLINK_HOME/usrlib in
application mode?
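(A minimal sketch of the image layout application mode expects; the base image tag and jar paths are assumptions, not from the thread. Jars under usrlib/ are picked up by the user-code classloader, while jars copied into lib/ land on the Flink parent classpath, which changes classloading behavior.)

```dockerfile
FROM flink:1.12.1-scala_2.12
# Application mode scans $FLINK_HOME/usrlib for user code and loads it
# with the user-code (child-first) classloader.
RUN mkdir -p $FLINK_HOME/usrlib
COPY target/poc-flink-0.0.1.jar $FLINK_HOME/usrlib/poc-flink.jar
```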
Regards
Praneeth Ramesh
On Mon, Sep 6, 2021 at 9:28 AM Arvid Heise wrote:
> This l
Hi All
I am trying to run a Flink Scala application which reads from Kafka, applies some
lookup transformations, and then writes to Kafka.
I am using Flink version 1.12.1.
I tested it locally and it works fine. But when I try to run it on a cluster
using the native Kubernetes integration I see weird errors