Hello experts,
My source is Kafka; records are generated into it, and I consume them using
the FlinkKafkaConsumer class.
Now, my first question is how to consume the event timestamp of the generated
records.
I know for a fact that for the CLI there is a property called
*print.timestamp=true* which gives you
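On the DataStream side, one way to surface the Kafka record timestamp is a custom KafkaDeserializationSchema that emits the timestamp alongside the value. This is a minimal sketch, assuming string-valued records; the class name and the tuple layout are illustrative choices, not anything from the thread:

```java
import java.nio.charset.StandardCharsets;

import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.connectors.kafka.KafkaDeserializationSchema;
import org.apache.kafka.clients.consumer.ConsumerRecord;

// Emits (value, timestamp) pairs. record.timestamp() is the Kafka record
// timestamp (CreateTime or LogAppendTime, depending on the topic's config).
public class TimestampedValueSchema
        implements KafkaDeserializationSchema<Tuple2<String, Long>> {

    @Override
    public boolean isEndOfStream(Tuple2<String, Long> nextElement) {
        return false;
    }

    @Override
    public Tuple2<String, Long> deserialize(ConsumerRecord<byte[], byte[]> record) {
        String value = new String(record.value(), StandardCharsets.UTF_8);
        return Tuple2.of(value, record.timestamp());
    }

    @Override
    public TypeInformation<Tuple2<String, Long>> getProducedType() {
        return TypeInformation.of(new TypeHint<Tuple2<String, Long>>() {});
    }
}
```

The schema would then be passed where a SimpleStringSchema normally goes, e.g. `new FlinkKafkaConsumer<>("my-topic", new TimestampedValueSchema(), props)` (topic name and properties are placeholders). The console consumer's `print.timestamp=true` is the CLI analogue of reading `record.timestamp()` here.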
Hi Yuxia,
Thank you so much for your response. Much appreciated. Here, by CDC I meant
the incremental changes that have to be pushed from Kafka to my processing
layer which is Flink.
Let me go through the links you shared.
Sid.
On Mon, Jun 27, 2022 at 6:39 AM yuxia wrote:
> > I me
Hi team,
Any help here please?
Thanks,
Sid
On Sat, Jun 25, 2022 at 4:02 PM Sid wrote:
> Hello,
>
> I have a current flow where the data from the Flink-Kafka connector is
> captured and processed using the Flink DataStream API and stored in Kafka
> topics. However, I would like
while capturing the data. I mean, CDC should be handled on the Kafka side.
Or do I need to use the Table API?
So, any ideas/links are much appreciated as I am trying to understand these
concepts.
TIA,
Sid
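For the Table API route, one common pattern is to treat the Kafka topic as a changelog source. A sketch, assuming the incremental changes arrive as Debezium-formatted JSON; the topic name, schema, bootstrap address, and class name are all placeholder assumptions, not details from the thread:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class KafkaChangelogJob {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // With the 'debezium-json' format, Flink interprets the topic as a
        // changelog stream (inserts, updates, and deletes), so the CDC
        // semantics are handled by the connector rather than by hand.
        tEnv.executeSql(
            "CREATE TABLE changes (" +
            "  id BIGINT," +
            "  payload STRING" +
            ") WITH (" +
            "  'connector' = 'kafka'," +
            "  'topic' = 'my-changelog-topic'," +
            "  'properties.bootstrap.servers' = 'localhost:9092'," +
            "  'scan.startup.mode' = 'earliest-offset'," +
            "  'format' = 'debezium-json'" +
            ")");

        // Downstream queries then observe the incremental changes directly.
        tEnv.executeSql("SELECT * FROM changes").print();
    }
}
```

If the topic carries plain events rather than change records, a regular `'format' = 'json'` source with DataStream processing is the simpler choice; the changelog format only pays off when upstream actually emits before/after change events.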
Hi Jonathan,
It would be better if you described your scenario along with the code; that
would make it easier for the community to help.
On Tue, 15 Feb 2022, 23:33 Jonathan Weaver, wrote:
> I'm getting the following exception when running locally from my IDE
> (IntelliJ), but it does not seem to occur
> when running