Hi,
As far as my understanding goes, the aggregated result for a window is not
included in the next window.
A window stays in the state store until it is deleted based on the retention
setting; however, the aggregated result for that window will include only the
records that fall within the window's duration.
If you h
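As a rough sketch of how retention can be set explicitly (assuming Kafka
Streams 2.1+, and reusing the `eventStream` and `eventSerde` names from the
original message at the bottom of this thread; the store name and retention
value here are placeholders, not from the thread):
`// Retention controls how long a closed window stays in the state store;
// the aggregate for each window still covers only that window's own records.
KTable<Windowed<String>, Long> counts = eventStream
    .groupByKey(Grouped.with(Serdes.String(), eventSerde))
    .windowedBy(TimeWindows.of(Duration.ofMillis(50)).grace(Duration.ofMillis(1)))
    .count(Materialized.<String, Long, WindowStore<Bytes, byte[]>>as("event-counts")
        .withRetention(Duration.ofMinutes(5)));`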
Thanks John.
That partially answers my question.
I'm a little confused about when a window will expire.
As you said, I will receive at most 20 events at T2, but as time goes on,
will the data from the first window always be included in the aggregated
result?
On Mon, Jan 20, 2020 at 7:55 AM John Roesler wrote:
Hi Sushrut,
I have to confess I don’t think I fully understand your last message, but I
will try to help.
It sounds like maybe you're thinking that Streams would just repeatedly emit
everything on every commit? That is certainly not the case. If there are only 10
events in window 1 and 10 in wind
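One way to see this concretely (a sketch, reusing the windowed `counts`
table from the sketch earlier in this thread) is to print the windowed keys:
each update carries its own window bounds, and the value covers only that
window's records:
`counts.toStream().foreach((windowedKey, value) ->
    System.out.printf("key=%s window=[%d,%d) aggregate=%s%n",
        windowedKey.key(),
        windowedKey.window().start(),
        windowedKey.window().end(),
        value));`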
Hey John,
I tried following the docs here about the configs:
`streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 10 * 1024 * 1024L);
// Set commit interval to 1 second.
streamsConfiguration.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 1000);`
https://kafka.apache.org/10/docum
Thanks John,
I'll try increasing the "CACHE_MAX_BYTES_BUFFERING_CONFIG"
and "COMMIT_INTERVAL_MS_CONFIG" configurations.
Thanks,
Sushrut
On Sat, Jan 18, 2020 at 11:31 AM John Roesler wrote:
Ah, I should add, if you actually want to use suppression, or
you need to resolve a similar error message in the future, you
probably need to tweak the batch sizes and/or timeout configs
of the various clients, and maybe the server as well.
That error message kind of sounds like the server went si
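For reference, a minimal suppression sketch (assuming Kafka Streams 2.1+ and
a windowed KTable named `aggregated` of the kind built later in this thread,
with a hypothetical `Event` type), together with the kind of client-level
overrides mentioned above; the values are illustrative guesses, not
recommendations:
`// Emit each window's final result only once the window closes.
KStream<Windowed<String>, List<Event>> finals = aggregated
    .suppress(Suppressed.untilWindowCloses(Suppressed.BufferConfig.unbounded()))
    .toStream();

// Hypothetical client-level overrides via StreamsConfig prefixes.
streamsConfiguration.put(
    StreamsConfig.producerPrefix(ProducerConfig.BATCH_SIZE_CONFIG), 64 * 1024);
streamsConfiguration.put(
    StreamsConfig.consumerPrefix(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG), 600_000);`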
Hi Sushrut,
That's frustrating... I haven't seen that before, but looking at the error
in combination with what you say happens without suppress makes
me think there's a large volume of data involved here. Probably,
the problem isn't specific to suppression, but it's just that the
interactions on
Hey,
I'm building a Streams application where I'm trying to aggregate a stream
of events
and get a list of events per key.
`eventStream
    .groupByKey(Grouped.with(Serdes.String(), eventSerde))
    .windowedBy(TimeWindows.of(Duration.ofMillis(50)).grace(Duration.ofMillis(1)))
    .aggregate(
        ArrayList::new,
        (key, event, list) -> { list.add(event); return list; },
        // eventListSerde is assumed to be defined elsewhere
        Materialized.with(Serdes.String(), eventListSerde))`