Re: FlinkKafkaConsumer010 - creating a data stream of type DataStream<ConsumerRecord>

2017-03-07 Thread Dominik Safaric
Hi Gordon, Thanks for the advice. Following it, I’ve implemented the KeyedDeserializationSchema and am now able to emit the record metadata to downstream operators. Regards, Dominik
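
A sketch of what such a wiring could look like: the consumer is constructed with a custom KeyedDeserializationSchema (here the hypothetical RecordWithMetadataSchema sketched under Gordon's reply below), and a downstream operator reads the partition and offset fields. The topic name, group id, and bootstrap servers are placeholders.

    import java.util.Properties;
    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.api.java.tuple.Tuple4;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;

    public class MetadataJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092");
            props.setProperty("group.id", "metadata-demo");

            // The custom schema emits (key, value, partition, offset) tuples instead of plain values.
            DataStream<Tuple4<String, String, Integer, Long>> records = env.addSource(
                    new FlinkKafkaConsumer010<>("my-topic", new RecordWithMetadataSchema(), props));

            // Downstream operators now see the Kafka metadata for every record.
            records
                    .map(new MapFunction<Tuple4<String, String, Integer, Long>, String>() {
                        @Override
                        public String map(Tuple4<String, String, Integer, Long> r) {
                            return "partition=" + r.f2 + " offset=" + r.f3 + " key=" + r.f0;
                        }
                    })
                    .print();

            env.execute("Kafka metadata example");
        }
    }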

Re: FlinkKafkaConsumer010 - creating a data stream of type DataStream<ConsumerRecord>

2017-03-06 Thread Tzu-Li (Gordon) Tai
Hi Dominik, I would recommend implementing a `KeyedDeserializationSchema` and supplying it to the constructor when initializing your FlinkKafkaConsumer. The `KeyedDeserializationSchema` exposes the metadata of the record, such as offset, partition, and key. In the schema, you can implement your own
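
A minimal sketch of such a schema, assuming the Flink 1.2-era connector API and UTF-8 string keys and values; it surfaces the key, value, partition, and offset as a Tuple4 so downstream operators can use the metadata. The class name is illustrative.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import org.apache.flink.api.common.typeinfo.TypeHint;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.java.tuple.Tuple4;
    import org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema;

    public class RecordWithMetadataSchema
            implements KeyedDeserializationSchema<Tuple4<String, String, Integer, Long>> {

        @Override
        public Tuple4<String, String, Integer, Long> deserialize(
                byte[] messageKey, byte[] message, String topic, int partition, long offset)
                throws IOException {
            // Keys (and, in principle, values) may be null for records produced without them.
            String key = messageKey == null ? null : new String(messageKey, StandardCharsets.UTF_8);
            String value = message == null ? null : new String(message, StandardCharsets.UTF_8);
            return new Tuple4<>(key, value, partition, offset);
        }

        @Override
        public boolean isEndOfStream(Tuple4<String, String, Integer, Long> nextElement) {
            // Keep consuming indefinitely.
            return false;
        }

        @Override
        public TypeInformation<Tuple4<String, String, Integer, Long>> getProducedType() {
            return TypeInformation.of(new TypeHint<Tuple4<String, String, Integer, Long>>() {});
        }
    }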

FlinkKafkaConsumer010 - creating a data stream of type DataStream<ConsumerRecord>

2017-03-06 Thread Dominik Safaric
Hi, Unfortunately I cannot find an option for using raw ConsumerRecord instances when creating a Kafka data stream. In general, I would like to use instances of that type because our use case requires certain metadata, such as the record offset and partition. So far I’ve examined the so
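
For context, a sketch of the baseline setup being described: with a plain value deserializer such as SimpleStringSchema, only the record value reaches the stream, so the offset and partition of the underlying ConsumerRecord are not visible to downstream operators. Topic name, group id, and bootstrap servers are placeholders.

    import java.util.Properties;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

    public class PlainValuesJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092");
            props.setProperty("group.id", "plain-values");

            // DataStream<String>: the raw ConsumerRecord is consumed internally by the
            // connector and never exposed; only the deserialized value is emitted.
            DataStream<String> values = env.addSource(
                    new FlinkKafkaConsumer010<>("my-topic", new SimpleStringSchema(), props));

            values.print();
            env.execute("Plain values example");
        }
    }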