Hi Liam, 

I have consumed the Avro records using the following Java code:
for (final ConsumerRecord<String, GenericRecord> record : records) {
    final String key = record.key();
    final GenericRecord value = record.value();
    // Print the schema the record was deserialized with
    System.out.println(value.getSchema());
    System.out.printf("key = %s, value = %s%n", key, value);
}
Next I need to write it to the database using the existing Kafka JDBC sink
connector API.

It seems I need to reuse the code here:
https://github.com/confluentinc/kafka-connect-jdbc/
Just create a new JdbcSinkTask, add the records to it, and the task will
automatically generate the SQL according to the record schema and execute
it, no matter what the table is.

But I have no idea how to do this.

Thanks,
Lei

wangl...@geekplus.com.cn

 
From: Liam Clarke-Hutchinson
Date: 2020-05-09 18:20
To: users
Subject: Re: Re: Write to database directly by referencing schema registry, no 
jdbc sink connector
Hi Lei,
 
This tutorial will introduce you to the Avro consumers.
https://docs.confluent.io/current/schema-registry/schema_registry_tutorial.html
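As a concrete starting point, here is a minimal sketch of the consumer configuration the Avro consumer needs. The deserializer class name and the `schema.registry.url` key are the documented Confluent settings; the broker address, registry address, and group id are placeholders you would replace with your own:

```java
import java.util.Properties;

public class AvroConsumerConfig {
    // Sketch of the consumer properties for a Schema Registry-aware consumer.
    // KafkaAvroDeserializer fetches the writer schema from the registry and
    // returns GenericRecord values, so no manual deserialization is needed.
    static Properties avroConsumerProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");   // placeholder
        props.setProperty("group.id", "binlog-sink");               // placeholder
        props.setProperty("key.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.setProperty("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.setProperty("schema.registry.url", "http://localhost:8081"); // placeholder
        return props;
    }

    public static void main(String[] args) {
        System.out.println(avroConsumerProps());
    }
}
```

With these properties, `new KafkaConsumer<String, GenericRecord>(props)` gives you the consumer used in the loop earlier in the thread.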
 
In terms of going from Avro record to SQL, the JDBC sink generates SQL
based on the field names in the schema, and configured table names.
 
IIRC, the Avro consumer returns an Avro GenericData.Record [1], which has
a getSchema() method that returns the schema used to deserialise it, so
you could access that to generate the SQL.
 
[1]:
https://avro.apache.org/docs/current/api/java/org/apache/avro/generic/GenericData.Record.html
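To sketch that idea: assuming you have already pulled the field names out of value.getSchema().getFields(), building a parameterized INSERT from them could look like the following. The table name and the helper itself are my own illustration, not part of kafka-connect-jdbc:

```java
import java.util.List;
import java.util.stream.Collectors;

public class InsertBuilder {
    // Build a parameterized INSERT from a table name and the schema's field
    // names. In a real consumer the names would come from
    // value.getSchema().getFields() on the deserialized GenericRecord, and the
    // statement would be executed with java.sql.PreparedStatement, binding one
    // record value per "?" placeholder.
    static String buildInsert(String table, List<String> fieldNames) {
        String columns = String.join(", ", fieldNames);
        String placeholders = fieldNames.stream()
                .map(f -> "?")
                .collect(Collectors.joining(", "));
        return "INSERT INTO " + table
                + " (" + columns + ") VALUES (" + placeholders + ")";
    }

    public static void main(String[] args) {
        // e.g. a binlog row with three columns
        System.out.println(buildInsert("orders", List.of("id", "amount", "created_at")));
        // → INSERT INTO orders (id, amount, created_at) VALUES (?, ?, ?)
    }
}
```

Using "?" placeholders and binding values through PreparedStatement avoids quoting/escaping problems that string-concatenated values would introduce.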
 
Good luck,
 
Liam Clarke-Hutchinson
 
On Sat, 9 May 2020, 10:03 pm wangl...@geekplus.com.cn, <
wangl...@geekplus.com.cn> wrote:
 
>
> Thanks Liam,
>
> I want to achieve the following using Java code:
>
> For each Avro-serialized record received:
>          1. Deserialize the record automatically by referencing the Schema
> Registry
>          2. Convert the record to the SQL statement that needs to be
> executed, and execute it
>
> Seems the kafka jdbc sink connector (
> https://docs.confluent.io/current/connect/kafka-connect-jdbc/sink-connector/index.html)
> can achieve this function.
>
> But I have no idea how to write this in Java code.
> Is there any code example to achieve this?
>
> Thanks,
> Lei
>
>
>
> wangl...@geekplus.com.cn
>
>
> From: Liam Clarke-Hutchinson
> Date: 2020-05-09 16:30
> To: users
> Subject: Re: Write to database directly by referencing schema registry, no
> jdbc sink connector
> Hi Lei,
>
> You could use the Kafka Avro consumer to deserialise records using the
> Schema Registry automatically.
>
> Then write to the DB as you see fit.
>
> Cheers,
>
> Liam Clarke-Hutchinson
>
> On Sat, 9 May 2020, 2:38 pm wangl...@geekplus.com.cn, <
> wangl...@geekplus.com.cn> wrote:
>
> >
> > We use Debezium to parse the binlog, serialize with Avro, and send to
> > Kafka.
> >
> > I need to consume the Avro-serialized binlog data and write it to the
> > target database.
> > I want to use self-written Java code instead of the Kafka JDBC sink
> > connector.
> >
> > How can I reference the Schema Registry, convert a Kafka message to the
> > corresponding table record, and write it to the corresponding table?
> > Is there any example code to do this ?
> >
> > Thanks,
> > Lei
> >
> >
> >
> > wangl...@geekplus.com.cn
> >
> >
>
