Hi,
I’d look into Kafka Streams: https://kafka.apache.org/documentation/streams/. 
You could use your connector to dump all records into a single topic and then 
write a streams application that can use arbitrary logic to route records to 
different downstream topics. A nice benefit, in my opinion, of retaining the 
original single topic is that if you ever want to reprocess the records and 
change how you route to different downstream topics, you’ll easily be able to 
do so. It’s also nice to have the original history for debugging. You can use 
log compaction (or a retention policy) to make sure you don’t run out of disk 
space. 
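To give a rough idea of what that streams app could look like: the routing
decision is just a function from record to topic name, which Kafka Streams can
apply per record via a TopicNameExtractor passed to to(). A minimal sketch
(the field name "priority" and the topic names are made-up examples; the
Streams wiring is shown in comments since it needs a running app):

```java
// Pure routing logic: pick a downstream topic from the record's content.
// The "priority" field and topic names here are hypothetical examples.
public class TopicRouter {

    public static String chooseTopic(String recordValue) {
        // Arbitrary criteria go here; this just inspects the serialized value.
        if (recordValue.contains("\"priority\":\"high\"")) {
            return "orders-high";
        }
        return "orders-normal";
    }

    // In the Streams app this would plug in roughly like:
    //
    //   StreamsBuilder builder = new StreamsBuilder();
    //   builder.<String, String>stream("orders-raw")
    //          .to((key, value, recordContext) -> chooseTopic(value));
    //
    // where "orders-raw" is the single topic the connector writes to.
}
```

Because the criteria live in one plain function, changing the routing later is
just a matter of editing chooseTopic and replaying the original topic.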
I’m less familiar with the connector ecosystem so maybe there’s a way to do 
this without writing a streams app in which case hopefully someone else can 
provide other options. 
Hope that helps a bit. 
Andrew.

> On May 7, 2021, at 11:00 AM, Yaniv Shuker <yaniv_shu...@hotmail.com> wrote:
> 
> Hi,
> 
> Can you please advise on the following:
> 
> Current behaviour:
> 1) I have records in a DB.
> 2) Using the JDBC connector I take the records out of the DB and send them 
> into a specific topic.
> Expected behaviour:
> Take the records out of the DB and push them to different topics based on 
> criteria from the record read from the DB.
> Can you please advise how to do this using a built-in Kafka mechanism?
> Thanks
> 
