Thanks a lot Mayuresh. I will look into the search-message-by-timestamp
feature in Kafka.

Cheers,
Senthil

On Thu, May 25, 2017 at 1:12 PM, Mayuresh Gharat <gharatmayures...@gmail.com> wrote:

> Hi Senthil,
>
> Kafka does support searching messages by timestamp since KIP-33:
> https://cwiki.apache.org/confluence/display/KAFKA/KIP-33+-+Add+a+time+based+log+index#KIP-33-Addatimebasedlogindex-Searchmessagebytimestamp
>
> The new consumer provides a way to get offsets by timestamp. You can use
> these offsets to seek to a position and consume from there. So if you want
> to consume a time range, you can get the start and end offsets based on
> the timestamps, seek to the start offset, and consume and process the data
> until you reach the end offset.
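That flow could be sketched roughly as follows against the 0.10.1+ KafkaConsumer API. The topic name (`raw-logs`), broker address, and group id are placeholders, and this is an untested outline rather than a drop-in implementation (a real reader would keep polling and pause finished partitions):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndTimestamp;
import org.apache.kafka.common.PartitionInfo;
import org.apache.kafka.common.TopicPartition;

public class TimeRangeReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");    // placeholder
        props.put("group.id", "time-range-reader");          // placeholder
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        long endTs = System.currentTimeMillis();
        long startTs = endTs - 2 * 60 * 1000L;               // "last 2 minutes"

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Manually assign every partition of the topic.
            List<TopicPartition> partitions = new ArrayList<>();
            for (PartitionInfo pi : consumer.partitionsFor("raw-logs")) {
                partitions.add(new TopicPartition(pi.topic(), pi.partition()));
            }
            consumer.assign(partitions);

            // Translate the start/end timestamps to offsets, per partition.
            Map<TopicPartition, Long> startQuery = new HashMap<>();
            Map<TopicPartition, Long> endQuery = new HashMap<>();
            for (TopicPartition tp : partitions) {
                startQuery.put(tp, startTs);
                endQuery.put(tp, endTs);
            }
            Map<TopicPartition, OffsetAndTimestamp> startOffsets =
                    consumer.offsetsForTimes(startQuery);
            Map<TopicPartition, OffsetAndTimestamp> endOffsets =
                    consumer.offsetsForTimes(endQuery);

            // Seek each partition to the first offset at or after startTs.
            // offsetsForTimes returns null for a partition with no such message.
            for (TopicPartition tp : partitions) {
                OffsetAndTimestamp start = startOffsets.get(tp);
                if (start != null) {
                    consumer.seek(tp, start.offset());
                }
            }

            // Consume, dropping anything at or past the end offset.
            ConsumerRecords<String, String> records = consumer.poll(1000);
            for (ConsumerRecord<String, String> record : records) {
                OffsetAndTimestamp end = endOffsets.get(
                        new TopicPartition(record.topic(), record.partition()));
                if (end == null || record.offset() < end.offset()) {
                    System.out.println(record.value());
                }
            }
        }
    }
}
```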
>
> But these timestamps are either CreateTime (when the message was created;
> you can specify this when you call send()) or LogAppendTime (when the
> message was appended to the log on the Kafka broker):
> https://kafka.apache.org/0101/javadoc/org/apache/kafka/clients/producer/ProducerRecord.html
>
> Kafka does not look at the fields in your data (key/value) when returning
> data to you. What I meant is that it will not look at a timestamp you
> specify inside the actual data payload.
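In other words, only the record-level timestamp is indexed, not anything inside the JSON value. Under CreateTime semantics, that timestamp can be set explicitly through the ProducerRecord constructor; the topic name and values below are illustrative, using the access_date epoch from the dataset in this thread:

```java
import org.apache.kafka.clients.producer.ProducerRecord;

public class TimestampedRecordExample {
    public static void main(String[] args) {
        // Epoch millis parsed from the payload's access_date (illustrative).
        long accessEpochMs = 1495634265044L;

        // ProducerRecord(topic, partition, timestamp, key, value):
        // a null partition lets the default partitioner choose one.
        // This timestamp is what the time index built by KIP-33 will see.
        ProducerRecord<String, String> record = new ProducerRecord<>(
                "raw-logs", null, accessEpochMs,
                String.valueOf(accessEpochMs),
                "{\"access_date\":\"2017-05-24 13:57:45.044\","
                        + "\"format\":\"json\",\"start\":\"1490296463.031\"}");
    }
}
```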
>
> Thanks,
>
> Mayuresh
>
> On Thu, May 25, 2017 at 12:43 PM, SenthilKumar K <senthilec...@gmail.com>
> wrote:
>
>> Hello Dev Team, please let me know if there is any option to read data
>> from Kafka (all partitions) using a timestamp. Also, can we set a custom
>> offset value for messages?
>>
>> Cheers,
>> Senthil
>>
>> On Wed, May 24, 2017 at 7:33 PM, SenthilKumar K <senthilec...@gmail.com>
>> wrote:
>>
>> > Hi All, we have been using Kafka for a use case that delivers real-time
>> > raw logs. I have a requirement to fetch data from Kafka by offset.
>> >
>> > DataSet Example :
>> > {"access_date":"2017-05-24 13:57:45.044","format":"json",
>> > "start":"1490296463.031"}
>> > {"access_date":"2017-05-24 13:57:46.044","format":"json",
>> > "start":"1490296463.031"}
>> > {"access_date":"2017-05-24 13:57:47.044","format":"json",
>> > "start":"1490296463.031"}
>> > {"access_date":"2017-05-24 13:58:02.042","format":"json",
>> > "start":"1490296463.031"}
>> >
>> > The above JSON data is stored in Kafka.
>> >
>> > Key --> access_date in epoch format
>> > Value --> the whole JSON.
>> >
>> > Data Access Pattern:
>> >   1) Get me the last 2 minutes of data?
>> >   2) Get me records between 2017-05-24 13:57:42:00 and 2017-05-24
>> > 13:57:44:00?
>> >
>> > How can we achieve this in Kafka?
>> >
>> > I tried using SimpleConsumer, but it expects a partition, and I am not
>> > sure SimpleConsumer would match our requirement.
>> >
>> > Appreciate your help!
>> >
>> > Cheers,
>> > Senthil
>> >
>>
>
>
>
> --
> -Regards,
> Mayuresh R. Gharat
> (862) 250-7125
>
