Hi,

Alternatively, if you would like to create continuous aggregates per key, you
can use ds.groupBy().reduce(..), or use one of the stateful functions in
the Scala API such as mapWithState.
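To illustrate the semantics of a per-key running reduce, here is a plain-Scala sketch (not the Flink API itself; Event, runningReducePerKey and the field names are made up for illustration) of what ds.groupBy(key).reduce(f) does conceptually: for every incoming element, the current aggregate for its key is updated with the reduce function and emitted.

```scala
// Illustrative, non-Flink sketch of per-key running-reduce semantics.
case class Event(key: String, value: Int)

def runningReducePerKey(
    events: Seq[Event])(f: (Int, Int) => Int): Seq[(String, Int)] = {
  // Per-key state, as a keyed streaming operator would maintain it.
  val state = scala.collection.mutable.Map.empty[String, Int]
  events.map { e =>
    // Combine the new value with the existing aggregate for this key,
    // or start a fresh aggregate if the key is unseen.
    val agg = state.get(e.key).map(f(_, e.value)).getOrElse(e.value)
    state(e.key) = agg
    (e.key, agg) // emit the updated aggregate downstream
  }
}
```

For example, a running sum per key: runningReducePerKey(Seq(Event("a", 1), Event("a", 2), Event("b", 5)))(_ + _) emits ("a", 1), ("a", 3), ("b", 5).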

For a rolling average per key you can check this example:
https://github.com/gyfora/summer-school/blob/master/flink/src/main/scala/summerschool/FlinkKafkaExample.scala
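The gist of the rolling-average pattern is to keep a (count, sum) pair per key as state and emit sum/count on every update. The sketch below shows that logic in plain Scala (Reading and rollingAveragePerKey are illustrative names, not from the linked example or the Flink API):

```scala
// Illustrative sketch: rolling average per key via (count, sum) state.
case class Reading(key: String, value: Double)

def rollingAveragePerKey(readings: Seq[Reading]): Seq[(String, Double)] = {
  // The (count, sum) pair is the state mapWithState-style operators track.
  val state = scala.collection.mutable.Map.empty[String, (Long, Double)]
  readings.map { r =>
    val (cnt, sum) = state.getOrElse(r.key, (0L, 0.0))
    val updated = (cnt + 1, sum + r.value)
    state(r.key) = updated
    (r.key, updated._2 / updated._1) // emit the current average for the key
  }
}
```

So rollingAveragePerKey(Seq(Reading("a", 1.0), Reading("a", 3.0), Reading("b", 4.0))) emits ("a", 1.0), ("a", 2.0), ("b", 4.0).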

Cheers,
Gyula

On Fri, Aug 21, 2015 at 3:28 PM Aljoscha Krettek <aljos...@apache.org>
wrote:

> Hi,
> with the current API this should do what you are after:
>
> val input = ...
>
> val result = input
>   .window(...)
>   .groupBy(...)
>   .reduceWindow( /* your reduce function */ )
>
> With the reduce function you should be able to implement any custom
> aggregations. You can also use foldWindow() if you want to do a functional
> fold over the window.
>
> I hope this helps.
>
> Cheers,
> Aljoscha
>
> On Fri, 21 Aug 2015 at 14:51 Philipp Goetze <philipp.goe...@tu-ilmenau.de>
> wrote:
>
>> Hello community,
>>
>> how do I define a custom aggregate function in Flink Streaming (Scala)?
>> Could you please provide an example on how to do that?
>>
>> Thank you and best regards,
>> Philipp
>>
>
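To make the windowed variant above concrete, here is a plain-Scala sketch of the per-window fold that foldWindow() performs conceptually: within one window, elements are grouped by key and each group is folded from an initial value (foldWindow here is an illustrative stand-in, not the Flink operator):

```scala
// Illustrative sketch: fold the elements of one window, per key.
def foldWindow[A, B](window: Seq[(String, A)], init: B)(
    f: (B, A) => B): Map[String, B] =
  window.groupBy(_._1).map { case (k, elems) =>
    // Fold all values for this key within the window, starting from init.
    k -> elems.map(_._2).foldLeft(init)(f)
  }
```

For instance, foldWindow(Seq(("a", 1), ("a", 2), ("b", 3)), 0)(_ + _) yields Map("a" -> 3, "b" -> 3), i.e. one aggregate per key per window.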
