If you don't want to send the schema each time, serialise your data using
Avro (or Protobuf); the schema is then held in the Schema
Registry. See https://www.youtube.com/watch?v=b-3qN_tlYR4&t=981s
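
As a rough sketch, the sink config would then use the Avro converter and
point at the Schema Registry instead of expecting a schema embedded in every
JSON message. The connector name, topic, and connection details below are
just placeholders, and this assumes you have Schema Registry running (it's a
separate community component, not part of plain Apache Kafka):

  {
    "name": "jdbc-sink-mysql",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
      "topics": "orders",
      "connection.url": "jdbc:mysql://localhost:3306/demo",
      "connection.user": "user",
      "connection.password": "password",
      "value.converter": "io.confluent.connect.avro.AvroConverter",
      "value.converter.schema.registry.url": "http://localhost:8081",
      "auto.create": "true"
    }
  }

The producer side would then write Avro (e.g. with KafkaAvroSerializer)
rather than JSON with the schema attached to every record.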

If you want to update a record instead of inserting a new one, you can use
upsert mode. See https://www.youtube.com/watch?v=b-3qN_tlYR4&t=627s
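
For example, the extra settings you'd add to the same sink config might look
like this (assuming the primary key is a field called "id" in the message
value; your own key column will differ):

    "insert.mode": "upsert",
    "pk.mode": "record_value",
    "pk.fields": "id",

Against MySQL the connector then issues INSERT .. ON DUPLICATE KEY UPDATE,
so a row with an existing key is updated rather than duplicated.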


-- 

Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff


On Thu, 7 May 2020 at 06:48, vishnu murali <vishnumurali9...@gmail.com>
wrote:

> Hey Guys,
>
> I am working on the JDBC Sink Connector to take data from a Kafka topic to
> MySQL.
>
> I have 2 questions.
>
> I am using plain Apache Kafka 2.5, not the Confluent version.
>
> 1) For inserting data, we currently need to include the schema along with
> every message. How can I avoid this? I want to send only the data.
>
> 2) At certain times I need to update an existing record rather than adding
> a new one. How can I achieve this?
>
