Is your topic the same in both cases?
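For comparison, a second consumer subscribing to the same topic via Spark
Structured Streaming would look roughly like this (a minimal sketch, not
necessarily your exact setup: it assumes the spark-sql-kafka package is on
the classpath, and the broker address and topic name are illustrative):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("video-consumer").getOrCreate()

    # Subscribe to the same topic the other consumer reads from.
    frames = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "localhost:9092")
              .option("subscribe", "video-stream")  # illustrative topic name
              .load())

    # Kafka rows carry key/value as binary; echo them to the console.
    query = (frames.selectExpr("CAST(key AS STRING)", "value")
             .writeStream.format("console").start())
    query.awaitTermination()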
On Tue, 24 Jul 2018, 19:15 Biswajit Ghosh wrote:
> Hi team,
>
> I ran into an issue while integrating with Spark Streaming using PySpark. I
> did receive the video stream data in a different consumer subscribed to the
> same topic.
>
> Works fine with this com
As per my understanding, you should set the retention period to
Long.MAX_VALUE hours. This will ensure that your messages won't be
deleted, because the retention period is effectively infinite.
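For example, the equivalent topic-level knob is retention.ms. A minimal
sketch using the kafka-python admin client (the broker address and topic
name are illustrative, and note that retention.ms=-1 also means "keep
forever"):

    from kafka.admin import KafkaAdminClient, ConfigResource, ConfigResourceType

    admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

    # 2**63 - 1 ms is Long.MAX_VALUE; retention.ms=-1 would likewise
    # disable time-based deletion for this topic.
    admin.alter_configs([
        ConfigResource(ConfigResourceType.TOPIC, "my-topic",
                       configs={"retention.ms": str(2**63 - 1)}),
    ])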
Regards,
Aman
On Mon, Jun 25, 2018 at 8:27 AM, Barathan Kulothongan wrote:
> Hi There, I am currently reading the Kafka Defin
> huge.
>
> The advantage is that you don't need to worry about generating unique
> keys.
>
> -Matthias
>
> On 1/23/18 10:39 AM, Aman Rastogi wrote:
> > Thanks Svante.
> >
> > Regards,
> > Aman
> >
> > On Tue, Jan 23, 2018 at 11:38 PM,
Thanks Svante.
Regards,
Aman
On Tue, Jan 23, 2018 at 11:38 PM, Svante Karlsson wrote:
> Yes, it will store the last value for each key.
>
> 2018-01-23 18:30 GMT+01:00 Aman Rastogi:
>
> > Hi All,
> >
> > We have a use case to store a stream for an infinite time (give
Hi All,
We have a use case to store a stream for an infinite time (given we have
enough storage).
We are planning to solve this with log compaction. If each message key is
unique and log compaction is enabled, the topic will store the whole stream
for an infinite time. Just wanted to check if my assumption is correct.
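Concretely, the setup I have in mind would look something like this (a
sketch with kafka-python; the broker address and topic name are
illustrative, and UUID keys stand in for "each message key is unique"):

    import uuid

    from kafka import KafkaProducer
    from kafka.admin import KafkaAdminClient, NewTopic

    # Create a compacted topic: with cleanup.policy=compact, the log keeps
    # at least the latest value per key, so unique keys imply that no
    # record is ever compacted away.
    admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
    admin.create_topics([
        NewTopic(name="events", num_partitions=1, replication_factor=1,
                 topic_configs={"cleanup.policy": "compact"}),
    ])

    # Produce with a fresh UUID as the key of every record.
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("events", key=uuid.uuid4().bytes, value=b"payload")
    producer.flush()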