> private long currentMaxTimestamp;
>
> @Override
> public long extractTimestamp(Tuple3 element, long previousElementTimestamp) {
>     long timestamp = element.f0.getMillis();
>     currentMaxTimestamp = Math.max(timestamp, currentMaxTimestamp);
>     return timestamp;
> }
>
> @Override
> public Watermark getCurrentWatermark() {
>     return new Watermark(currentMaxTimestamp - maxOutOfOrderness);
> }
> }
>
> Thanks,
> Chris
>
>
>
> --
> View this message in context:
> http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Multiple-windows-with-large-number-of-partitions-tp6521p6562.html
> Sent from the Apache Flink User Mailing List archive. mailing list archive
> at Nabble.com.
>
@Override
public Watermark getCurrentWatermark() {
    return new Watermark(currentMaxTimestamp - maxOutOfOrderness);
}
}
Thanks,
Chris
Hi,
is there a reason for keying on both the "date only" field and the
"userid"? I think you should be fine by just specifying that you want 1-day
windows on your timestamps.
Also, do you have a timestamp extractor in place that takes the timestamp
from your data and sets it as the internal event time?
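The core of the quoted extractor (remember the largest timestamp seen so far, emit watermarks that trail it by a fixed out-of-orderness bound) can be sketched without any Flink dependencies. The class name and the 10-second bound here are illustrative assumptions, not values from the thread:

```java
// Sketch of bounded-out-of-orderness watermarking, mirroring the quoted
// extractor's logic. MAX_OUT_OF_ORDERNESS is an assumed example value.
public class BoundedOutOfOrdernessSketch {
    static final long MAX_OUT_OF_ORDERNESS = 10_000L; // 10 seconds (assumed)

    // Start low enough that the first watermark does not overflow.
    private long currentMaxTimestamp = Long.MIN_VALUE + MAX_OUT_OF_ORDERNESS;

    // Mirrors extractTimestamp(): track the max timestamp seen so far.
    public long extractTimestamp(long elementTimestamp) {
        currentMaxTimestamp = Math.max(elementTimestamp, currentMaxTimestamp);
        return elementTimestamp;
    }

    // Mirrors getCurrentWatermark(): the watermark trails the max timestamp,
    // so elements arriving late but within the bound still reach their window.
    public long getCurrentWatermark() {
        return currentMaxTimestamp - MAX_OUT_OF_ORDERNESS;
    }

    public static void main(String[] args) {
        BoundedOutOfOrdernessSketch s = new BoundedOutOfOrdernessSketch();
        s.extractTimestamp(100_000L);
        s.extractTimestamp(95_000L); // out-of-order element; max is unchanged
        System.out.println(s.getCurrentWatermark()); // prints 90000
    }
}
```

Windows up to the watermark (here 90000) can be considered complete; the 95000 element is still assigned correctly because the watermark lags the max timestamp.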
I've been working through the flink demo applications and started in on a
prototype, but have run into an issue with how to approach the problem of
getting a daily unique user count from a traffic stream. I'm using event
time as the time characteristic.

Sample event stream (timestamp, userid):
2015-12-
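The bookkeeping behind the daily unique-user count can be sketched in plain Java: bucket each event by its calendar day (UTC assumed) and keep a set of user ids per day. This is what a 1-day event-time window with a distinct-count aggregate would do in Flink; the class and method names below are illustrative, not Flink API:

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of daily unique-user counting: derive the day from the event
// timestamp (UTC assumed) and keep a set of user ids per day, so
// duplicates within a day are counted once.
public class DailyUniqueUsers {
    private final Map<LocalDate, Set<String>> usersPerDay = new HashMap<>();

    public void add(long timestampMillis, String userId) {
        LocalDate day = Instant.ofEpochMilli(timestampMillis)
                .atZone(ZoneOffset.UTC).toLocalDate();
        usersPerDay.computeIfAbsent(day, d -> new HashSet<>()).add(userId);
    }

    public int uniqueUsers(LocalDate day) {
        return usersPerDay.getOrDefault(day, Collections.emptySet()).size();
    }

    public static void main(String[] args) {
        DailyUniqueUsers d = new DailyUniqueUsers();
        d.add(1451606400000L, "alice"); // 2016-01-01T00:00:00Z
        d.add(1451606400000L, "alice"); // same user, same day: no change
        d.add(1451610000000L, "bob");   // one hour later, same day
        System.out.println(d.uniqueUsers(LocalDate.of(2016, 1, 1))); // prints 2
    }
}
```

As the reply above suggests, keying on a separate "date only" field is unnecessary when the window itself is already day-sized: the window assignment derives the day from the event timestamp.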