Thanks. I will go through the schema registry.

On 3 May 2016 at 22:46, Gerard Klijs <gerard.kl...@dizzit.com> wrote:

> Then you're probably best off using the Confluent schema registry. You can
> then use io.confluent.kafka.serializers.KafkaAvroDeserializer for the
> client, with KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG="true",
> to get back the object, deserialized with the same version of the schema
> the object was sent with.
>
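A minimal sketch of the consumer setup described above, assuming the Confluent
serializers are on the classpath; MyEvent is a hypothetical Avro-generated
SpecificRecord class, and the broker, registry URL, group id, and topic name
are placeholders:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class SpecificAvroConsumerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");           // placeholder broker
            props.put("group.id", "example-group");                     // placeholder group id
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "io.confluent.kafka.serializers.KafkaAvroDeserializer");
            props.put("schema.registry.url", "http://localhost:8081");  // assumed registry URL
            // Same setting as KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG
            props.put("specific.avro.reader", "true");

            // MyEvent stands in for an Avro-generated SpecificRecord class
            try (KafkaConsumer<String, MyEvent> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
                ConsumerRecords<String, MyEvent> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, MyEvent> record : records) {
                    System.out.println(record.value());
                }
            }
        }
    }
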
> On Tue, May 3, 2016 at 12:06 PM Ratha v <vijayara...@gmail.com> wrote:
>
> > I plan to use a different topic for each type of object (number of
> > topics = number of object types), so I would need as many
> > serializers/deserializers as topics/objects.
> >
> > What would be the best way to achieve this?
> >
> > On 3 May 2016 at 18:20, Gerard Klijs <gerard.kl...@dizzit.com> wrote:
> >
> > > If you put them in one topic, you will need one 'master'
> > > serializer/deserializer which can handle all the formats.
> > > I don't know how you would like to use Avro schemas. The Confluent
> > > schema registry is by default configured to handle one schema at a
> > > time for one topic, but you could configure it to use multiple
> > > non-compatible schemas in one topic. Each object will be saved with a
> > > schema id, making it possible to get back the original object.
> > >
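The producer side of that setup might look roughly like the sketch below,
assuming Confluent's KafkaAvroSerializer and the same hypothetical MyEvent
class and placeholder addresses as above; the serializer registers the
record's schema with the registry and writes the schema id in front of the
Avro payload, which is how the original object can be recovered later:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class AvroProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");          // placeholder broker
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("schema.registry.url", "http://localhost:8081"); // assumed registry URL

            try (KafkaProducer<String, MyEvent> producer = new KafkaProducer<>(props)) {
                MyEvent event = new MyEvent();  // hypothetical Avro-generated class
                // The serializer looks up (or registers) the schema in the registry and
                // prefixes the message with its id, so consumers can resolve it again.
                producer.send(new ProducerRecord<>("my-topic", "key-1", event));
                producer.flush();
            }
        }
    }
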
> > > On Tue, May 3, 2016 at 1:52 AM Ratha v <vijayara...@gmail.com> wrote:
> > >
> > > > What is the best way to do this? Do we need to have a common
> > > > serializer/deserializer for all types of objects we publish, or a
> > > > separate one for each object?
> > > > If we have separate serializers/deserializers, then how can I
> > > > configure Kafka?
> > > > Or is it recommended to use Avro schemas?
> > > >
> > > > Thanks
> > > >
> > > > On 2 May 2016 at 18:43, Gerard Klijs <gerard.kl...@dizzit.com> wrote:
> > > >
> > > > > I think by design it would be better to put different kinds of
> > > > > messages in different topics. But if you want to mix them, you can
> > > > > make your own serializer/deserializer: append a 'magic byte' to the
> > > > > bytes you get after serializing, so you can deserialize using the
> > > > > correct methods. The custom deserializer would always return an
> > > > > Object, which you could cast when needed in the poll loop of the
> > > > > consumer. I think this is the cleanest/best way, but maybe someone
> > > > > has a different idea?
> > > > >
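One possible reading of that 'magic byte' idea, sketched as a single wrapper
class implementing Kafka's Serializer and Deserializer interfaces: it writes
one type byte before the payload and switches on it when reading. Foo, Bar,
and the per-type helper methods are hypothetical stand-ins for your own types
and encoders, not an existing API:

    import java.nio.ByteBuffer;
    import java.util.Arrays;
    import java.util.Map;
    import org.apache.kafka.common.serialization.Deserializer;
    import org.apache.kafka.common.serialization.Serializer;

    // Wraps per-type (de)serialization and prefixes one "magic" byte naming the type.
    public class MultiTypeSerde implements Serializer<Object>, Deserializer<Object> {

        private static final byte FOO = 0;  // hypothetical type markers
        private static final byte BAR = 1;

        @Override
        public void configure(Map<String, ?> configs, boolean isKey) { }

        @Override
        public byte[] serialize(String topic, Object data) {
            byte marker;
            byte[] payload;
            if (data instanceof Foo) {                 // Foo, Bar and the helpers below
                marker = FOO;                          // stand in for your own types
                payload = serializeFoo((Foo) data);
            } else if (data instanceof Bar) {
                marker = BAR;
                payload = serializeBar((Bar) data);
            } else {
                throw new IllegalArgumentException("Unsupported type: " + data.getClass());
            }
            return ByteBuffer.allocate(1 + payload.length).put(marker).put(payload).array();
        }

        @Override
        public Object deserialize(String topic, byte[] bytes) {
            byte[] payload = Arrays.copyOfRange(bytes, 1, bytes.length);
            switch (bytes[0]) {
                case FOO: return deserializeFoo(payload);
                case BAR: return deserializeBar(payload);
                default:  throw new IllegalArgumentException("Unknown magic byte: " + bytes[0]);
            }
        }

        @Override
        public void close() { }

        // Stand-ins for whatever per-type encoding you already have.
        private byte[] serializeFoo(Foo foo) { return new byte[0]; }
        private byte[] serializeBar(Bar bar) { return new byte[0]; }
        private Foo deserializeFoo(byte[] b) { return new Foo(); }
        private Bar deserializeBar(byte[] b) { return new Bar(); }
    }

In the consumer you would then configure this class as the value deserializer
and cast the returned Object with instanceof checks inside the poll loop.
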
> > > > > On Mon, May 2, 2016 at 7:54 AM Ratha v <vijayara...@gmail.com> wrote:
> > > > >
> > > > > > Hi all;
> > > > > >
> > > > > > Say I publish and consume different types of Java objects. For
> > > > > > each, I have to define my own serializer implementation. How can
> > > > > > we provide all the implementations in the Kafka consumer/producer
> > > > > > properties file under the "serializer.class" property?
> > > > > >
> > > > > >
> > > > > > --
> > > > > > -Ratha
> > > > > > http://vvratha.blogspot.com/
> > > > > >
> > > > >
> > > >
> > > >
> > > >
> > > > --
> > > > -Ratha
> > > > http://vvratha.blogspot.com/
> > > >
> > >
> >
> >
> >
> > --
> > -Ratha
> > http://vvratha.blogspot.com/
> >
>



-- 
-Ratha
http://vvratha.blogspot.com/
