Although this may not be natively supported, you can mimic the behavior: use a micro-batch interval of one minute, and then, in your updateStateByKey update function, check how long the session has been idle. If it has been idle for longer than 10 minutes, return None for that key so it is removed from the state stream (that is the point at which you'd forward the session's data to your sink).
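A minimal sketch of the idea above. The update function has the (new values, previous state) -> new state shape that updateStateByKey expects; in a real Spark Streaming job you would pass it to DStream.updateStateByKey, but it is shown standalone here. All names (update_session, the dict-based session state, SESSION_GAP_MS) are illustrative, not part of any API:

```python
import time

SESSION_GAP_MS = 10 * 60 * 1000  # 10-minute inactivity timeout

def update_session(new_events, state, now_ms=None):
    """Shape expected by updateStateByKey:
    (new values for a key this batch, previous state) -> new state.
    Returning None drops the key, which is how the session "closes"."""
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    if state is None:
        # First activity seen for this key: open a new session.
        if new_events:
            return {"events": list(new_events), "last_seen": now_ms}
        return None
    if not new_events:
        # No activity this batch: close the session once the gap is exceeded.
        if now_ms - state["last_seen"] > SESSION_GAP_MS:
            return None  # key removed; forward state to your sink here
        return state
    # Continued activity keeps the session alive and extends it.
    return {"events": state["events"] + list(new_events),
            "last_seen": now_ms}
```

Because the check only runs when a batch fires, the session actually closes on the first batch after the gap elapses, so with a one-minute micro-batch the timeout is accurate to within about a minute.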
On Fri, Dec 2, 2016 at 12:13 PM, Michael Armbrust <mich...@databricks.com> wrote:
> Here is the JIRA for adding this feature:
> https://issues.apache.org/jira/browse/SPARK-10816
>
> On Fri, Dec 2, 2016 at 11:20 AM, Fritz Budiyanto <fbudi...@icloud.com> wrote:
>>
>> Hi All,
>>
>> I need help on how to implement Flink's event session window in Spark. Is
>> this possible?
>>
>> For instance, I wanted to create a session window with a timeout of 10
>> minutes (see the Flink snippet below). Continuous events keep the session
>> window alive. If there is no activity for 10 minutes, the session window
>> should close and forward its data to a sink function.
>>
>> // event-time session windows
>> input
>>     .keyBy(<key selector>)
>>     .window(EventTimeSessionWindows.withGap(Time.minutes(10)))
>>     .<windowed transformation>(<window function>);
>>
>> Any idea?
>>
>> Thanks,
>> Fritz

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org