transactional protocol
upgrade. Are there any objections to deprecating the public
ConsumerGroupMetadata(String groupId) constructor too? It would have
given us a clear hint that we had been doing something wrong from the
start.
--
Best regards
Paweł Szymczyk
tedDeviceSerde()));
which means that I would like to change the sink topic Serde on the fly
without mapping the objects manually.
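(Editor's note: a minimal, self-contained sketch of the "pluggable Serde, no manual mapping" idea. It uses hand-rolled functional stand-ins rather than Kafka's real org.apache.kafka.common.serialization interfaces, and the DeviceEvent type and pipe-delimited wire format are made up for the demo.)

```java
import java.util.function.Function;

// Self-contained sketch: a serde assembled from two functions, in the
// spirit of Serdes.serdeFrom(serializer, deserializer) from kafka-clients.
// DeviceEvent and the wire format are hypothetical.
public class SerdeSketch {
    record DeviceEvent(int lvl, String timestamp) {}

    // Pairing a serializer and deserializer makes the sink format pluggable:
    // swap the serde, not the mapping code in the topology.
    record FnSerde<T>(Function<T, String> serializer, Function<String, T> deserializer) {}

    static final FnSerde<DeviceEvent> DEVICE_SERDE = new FnSerde<>(
            e -> e.lvl() + "|" + e.timestamp(),
            s -> {
                String[] parts = s.split("\\|", 2);
                return new DeviceEvent(Integer.parseInt(parts[0]), parts[1]);
            });

    // Round-trip through the serde, as a sink followed by a source would.
    static DeviceEvent roundTrip(DeviceEvent e) {
        return DEVICE_SERDE.deserializer().apply(DEVICE_SERDE.serializer().apply(e));
    }
}
```

In a real Kafka Streams topology the analogous move is handing the Serde to the sink operation itself (e.g. via Produced.with) instead of mapping values into another type first.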
On Wed, 26 Feb 2025 at 07:46, Paweł Szymczyk wrote:
> I will provide you with the sample source code on GitHub today.
>
>
> On 25 February 2025 20:47:36 CE
https://github.com/apache/kafka/pull/18977
On Thu, 20 Feb 2025 at 07:48, Paweł Szymczyk wrote:
> Sure, I would love to continue working on this task and provide an
> interface. I see that someone else has recently been assigned to the Jira
> ticket; please let him know that I would like to handle it
in it correctly...
>
>But I did file https://issues.apache.org/jira/browse/KAFKA-18836 -- feel free
>to pick it up if you are interested in contributing a fix for the Apache
>Kafka 4.1 release.
>
>
>-Matthias
>
>
>On 2/19/25 9:17 AM, Paweł Szymczyk wrote:
>> Hello!
>
>Have you tried that?
>
>The question is where to put the validation of the array elements. That
>also depends on what you mean by "automatic JSON schema validation
>feature".
>
>Best,
>Bruno
>
>
>
>
>On 25.02.25 17:22, Paweł Szymczyk wrote:
>>
{
"event":{
"lvl":3,
"someValue":{
"horizontal":11.149606704711914,
"vertical":7.503936767578125
},
"consequence":2,
"timestamp":"2024-04-08T02:06:49.000Z"
}
}
And the perfect solution sh
>Then, you can read the
>elements again from the output topic and write them into the table.
>
>Regarding point 1:
>Could you do the validation when you write the elements to the output topic?
>
>Best,
>Bruno
>
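(Editor's sketch of the suggestion above: do the per-element check at the point of the write. Everything here is self-contained; the Event fields and the concrete rules are assumptions read off the sample payload, not a real schema.)

```java
import java.util.ArrayList;
import java.util.List;

// Self-contained sketch: validate each array element right before it is
// written to the output topic. Event's fields mirror the sample payload;
// the rules in isValid are placeholders, not a real JSON schema.
public class ElementValidation {
    record Event(int lvl, double horizontal, double vertical,
                 int consequence, String timestamp) {}

    // One element's validity check; this is where a JSON-schema check
    // against the single element (not the whole array) would go.
    static boolean isValid(Event e) {
        return e.lvl() >= 0 && e.timestamp() != null && !e.timestamp().isEmpty();
    }

    // In a topology this filter would sit in the processor producing to the
    // output topic; invalid elements could be routed to a dead-letter topic
    // instead of being silently dropped.
    static List<Event> validOnly(List<Event> batch) {
        List<Event> out = new ArrayList<>();
        for (Event e : batch) {
            if (isValid(e)) {
                out.add(e);
            }
        }
        return out;
    }
}
```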
>On 25.02.25 14:20, Paweł Szymczyk wrote:
Dear Kafka users,
I have spent the last few days working with Kafka Streams on some tasks
that looked very easy at first glance, but I ended up struggling with the
StreamsBuilder API and did something I am not proud of. Please help me; I
am open to any suggestions.
On the input topic we have a mes