Hi Robin

Is it possible to integrate Apache Kafka with the Confluent Schema
Registry like you said?

I don't know how to do this. Could you point me to a reference?

On Mon, May 11, 2020, 14:09 Robin Moffatt <ro...@confluent.io> wrote:

> You can use Apache Kafka as you are currently using, and just deploy Schema
> Registry alongside it.
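As a minimal sketch of "deploy Schema Registry alongside it": Schema Registry is a separate process that only needs to be pointed at the existing Kafka brokers. The host/port values below are placeholders for a local setup; the property keys are the standard ones from the Confluent Schema Registry configuration.

```properties
# schema-registry.properties (illustrative local config)
# HTTP endpoint the registry listens on
listeners=http://0.0.0.0:8081
# Existing plain Apache Kafka brokers (placeholder address)
kafkastore.bootstrap.servers=PLAINTEXT://localhost:9092
# Internal topic the registry stores schemas in
kafkastore.topic=_schemas
```

The Kafka cluster itself stays untouched; clients that want schemas just need the registry's URL.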
>
>
> --
>
> Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff
>
>
> On Sat, 9 May 2020 at 02:16, Chris Toomey <ctoo...@gmail.com> wrote:
>
> > You have to either 1) use one of the Confluent serializers
> > <https://docs.confluent.io/current/schema-registry/serdes-develop/index.html#>
> > when you publish to the topic, so that the schema (or reference to it) is
> > included, or 2) write and use a custom converter
> > <https://kafka.apache.org/25/javadoc/org/apache/kafka/connect/storage/Converter.html>
> > that knows about the data schema and can take the Kafka record value and
> > convert it into a Kafka Connect record (by implementing the toConnectData
> > converter method), which is what the sink connectors are driven from.
> >
> > See https://docs.confluent.io/current/connect/concepts.html#converters
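Option 1 above can be sketched as producer configuration. This is an illustrative snippet, not the thread authors' code: the serializer class name comes from Confluent's kafka-avro-serializer artifact, and the broker and registry addresses are placeholders.

```java
import java.util.Properties;

public class AvroProducerConfig {
    public static Properties build() {
        Properties props = new Properties();
        // Plain Apache Kafka broker; Schema Registry runs alongside it (placeholder address)
        props.put("bootstrap.servers", "localhost:9092");
        // Confluent's Avro serializer registers the schema (or a reference
        // to it) with Schema Registry on the first send
        props.put("key.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // Where the serializer finds Schema Registry (placeholder URL)
        props.put("schema.registry.url", "http://localhost:8081");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(AvroProducerConfig.build().getProperty("value.serializer"));
    }
}
```

These properties are then passed to a regular KafkaProducer; the sink connector side reads the schema back via the matching AvroConverter.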
> >
> > Chris
> >
> >
> >
> > On Fri, May 8, 2020 at 6:59 AM vishnu murali <vishnumurali9...@gmail.com>
> > wrote:
> >
> > > Hey Guys,
> > >
> > > I am using Apache Kafka 2.5, not Confluent.
> > >
> > > I am trying to send data from a topic to a database using the JDBC
> > > sink connector.
> > >
> > > We need to send that data with the appropriate schema as well.
> > >
> > > I am not using the Confluent version of Kafka.
> > >
> > > So can anyone explain how I can do this?
> > >
> >
>
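Tying the thread together for the JDBC sink use case: the connector needs a converter that can recover the schema, which is what Robin's and Chris's replies point at. A hedged sketch of a sink connector config using Confluent's AvroConverter (the connector class and converter class names are from Confluent's JDBC connector and Avro converter artifacts; the topic name, database URL, and registry URL are placeholders):

```json
{
  "name": "jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "my-topic",
    "connection.url": "jdbc:mysql://localhost:3306/mydb",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "auto.create": "true"
  }
}
```

With data produced via the Avro serializer as described above, the converter's toConnectData step gets a schema to hand to the sink connector, which is what allows it to create and populate the database table.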
