This sounds like a windowed join between the raw stream and the aggregated
stream.
It might be possible to do the "lookup" in the second raw stream with
another windowed join. If not, you can fall back to the DataStream API /
ProcessFunction and implement the lookup logic as you need it.
Best, Fabian
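
If the second windowed join doesn't work out, a rough sketch of the ProcessFunction fallback mentioned above could look like this (not from the thread; the types RawEvent, AggResult, Enriched and the streams rawStream / aggStream are placeholders): connect the raw stream with the aggregated one, keep the latest aggregate per symbol in keyed state, and enrich raw records as they arrive.

    import org.apache.flink.api.common.state.ValueState;
    import org.apache.flink.api.common.state.ValueStateDescriptor;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.functions.co.CoProcessFunction;
    import org.apache.flink.util.Collector;

    DataStream<Enriched> enriched = rawStream
        .keyBy("symbol")
        .connect(aggStream.keyBy("symbol"))
        .process(new CoProcessFunction<RawEvent, AggResult, Enriched>() {

          // Latest aggregate seen for the current symbol.
          private transient ValueState<AggResult> latestAgg;

          @Override
          public void open(Configuration parameters) {
            latestAgg = getRuntimeContext().getState(
                new ValueStateDescriptor<>("latestAgg", AggResult.class));
          }

          @Override
          public void processElement1(RawEvent raw, Context ctx, Collector<Enriched> out)
              throws Exception {
            // "Lookup": enrich the raw record with the latest aggregate for its symbol.
            AggResult agg = latestAgg.value();
            if (agg != null) {
              out.collect(new Enriched(raw, agg));
            }
          }

          @Override
          public void processElement2(AggResult agg, Context ctx, Collector<Enriched> out)
              throws Exception {
            // Remember the most recent aggregate for this symbol.
            latestAgg.update(agg);
          }
        });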
Thanks Fabian. I tried to use "rowtime" and Flink gives me the exception below:
Exception in thread "main" org.apache.flink.table.api.ValidationException:
SlidingGroupWindow('w2, 'end, 150.rows, 1.rows) is invalid: Event-time
grouping windows on row intervals in a stream environment are currently not
supported.
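
One possible way around this (just a sketch, not from the thread): event-time group windows on streams only take time intervals, so the sliding window over 150 rows would have to become a time-based one. aggTable, the interval sizes, and the "rowtime" attribute (produced as Fabian suggests in the quoted mail below) are placeholders.

    Table result = aggTable
        // time interval instead of "150.rows" / "1.rows"; sizes are placeholders
        .window(Slide.over("150.seconds").every("15.seconds").on("rowtime").as("w2"))
        .groupBy("symbol, w2")
        .select("symbol, p_max.max as p_max, p_min.min as p_min");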
Sorry, I forgot to CC the user mailing list in my reply.
2018-04-12 17:27 GMT+02:00 Fabian Hueske :
> Hi,
>
> Assuming you are using event time, the right function to generate a row
> time attribute from a window would be "w1.rowtime" instead of "w1.start".
>
> The reason why Flink is picky about
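
Applied to the first aggregation from the original mail (quoted below), that suggestion would look roughly like this; everything except w1.rowtime is taken from that snippet:

    Table agg = cTable
        .window(Tumble.over("15.seconds").on("timeMill").as("w1"))
        .groupBy("symbol, w1")
        // w1.rowtime instead of w1.start/w1.end keeps an event-time attribute
        // that the next group window can use
        .select("w1.rowtime as rowtime, symbol, price.max as p_max, price.min as p_min");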
Hi all,
I'd like to use two group windows in a chain in my program, as below.
Table myTable = cTable
    .window(Tumble.over("15.seconds").on("timeMill").as("w1"))
    .groupBy("symbol, w1")
    .select("w1.start as start, w1.end as end, symbol, price.max as p_max, price.min as p_min")