Hi,
I am facing this issue:
[2022-07-01 19:01:05,548] INFO Topic 'postgres.public.content_history'
already exists. (org.apache.kafka.connect.runtime.WorkerSourceTask:423)
[2022-07-01 19:01:05,641] INFO
WorkerSourceTask{id=smtip-de-content2-source-connector-0} Committing
offsets (org.apache.kafka.co
Hi, you need to increase the record and message size limits, because your
actual message payload is bigger than what's mentioned in the properties file.
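For reference, these are the standard Kafka size limits that usually have to be raised together (the property names are from the stock Kafka configuration; the 10 MB values below are only placeholders, not a recommendation):

```properties
# Broker: maximum record batch size the broker will accept (default is ~1 MB)
message.max.bytes=10485760

# Broker: replication fetches must also be allowed to carry batches that large
replica.fetch.max.bytes=10485760

# Producer: maximum size of a single produce request
# (in a Connect worker properties file this is prefixed, e.g. producer.max.request.size)
max.request.size=10485760

# Consumer: maximum bytes fetched per partition
max.partition.fetch.bytes=10485760
```

If only one of these is raised (for example the producer limit but not the broker's `message.max.bytes`), the oversized record is still rejected at the next hop.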
Regards,
On Fri, 1 Jul 2022 at 20:24, Divya Jain
wrote:
> Hi,
>
> I am facing this issue:
> [2022-07-01 19:01:05,548] INFO Topic 'postgres.public.content_history'
Hi,
It's already set to the maximum that can be defined; it cannot go beyond
that. I am not sure how to solve this.
Thanks
Divya Jain
On Sat, 2 Jul, 2022, 1:40 am M. Manna, wrote:
> Hi, you need to increase record and message size because your real message
> payload is bigger than what’s mention in properties f
Hi Divya,
Something we've found useful in the past is a secondary key-value cache
that can store large payloads under a key; you then pass only the key
through Kafka, and the consumer uses it to retrieve the payload on the
other end. This usually leads to much better performance from Kafka's
perspective.
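The idea above is often called the claim-check pattern. A minimal sketch, using a plain dict to stand in for the external key-value store (in practice this would be Redis, S3, or similar) and a hypothetical 1 MB inline threshold:

```python
import json
import uuid

# Stand-in for an external key-value store such as Redis; illustrative only.
cache = {}

# Hypothetical threshold, roughly Kafka's default message.max.bytes (~1 MB).
MAX_INLINE_BYTES = 1024 * 1024

def to_kafka_record(payload: dict) -> bytes:
    """Producer side: if the payload fits, send it inline; otherwise store
    it in the cache and send only a reference key (the 'claim check')."""
    raw = json.dumps(payload).encode("utf-8")
    if len(raw) <= MAX_INLINE_BYTES:
        return json.dumps({"inline": payload}).encode("utf-8")
    key = str(uuid.uuid4())
    cache[key] = raw  # large body travels out of band, not through Kafka
    return json.dumps({"claim_check": key}).encode("utf-8")

def from_kafka_record(value: bytes) -> dict:
    """Consumer side: resolve a claim check back into the full payload."""
    envelope = json.loads(value)
    if "inline" in envelope:
        return envelope["inline"]
    return json.loads(cache[envelope["claim_check"]])
```

The record that actually flows through Kafka stays tiny regardless of payload size, which is why the brokers see much better performance.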
Hi.
Based on what you have said, is it possible to do the key-value cache at
the source connector level? I checked some of the documentation, but even
if I pass cache.max.bytes.buffering in my worker properties, it gives me
the same problem. It's more of an integration with
Ks