Hello.

I have a streaming job that processes a stream of events, taking action
when I see anomalous events.  I also keep a count of events observed,
using updateStateByKey to maintain a map of event type to count.  I would
like to periodically (every 5 minutes) write the results of my counts to
a database.  Is there a built-in mechanism or established pattern for
executing periodic jobs in Spark Streaming?
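For concreteness, here is a rough sketch of the shape of my job.  The
socket source, 10-second batch interval, checkpoint path, and saveCount
helper are all placeholders rather than my actual code.  One approach I
considered is checking the batch time inside foreachRDD, as in the last
block, but I am not sure it is the established pattern:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object PeriodicCounts {
      // Placeholder for the real database write (e.g. a JDBC insert).
      def saveCount(eventType: String, count: Long): Unit =
        println(s"$eventType -> $count")

      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("PeriodicCounts"), Seconds(10))
        ssc.checkpoint("/tmp/periodic-counts") // required by updateStateByKey

        // Placeholder source; the real job consumes an event stream
        // and maps each event to (eventType, 1).
        val events = ssc.socketTextStream("localhost", 9999).map(t => (t, 1L))

        // Running count per event type, carried across batches.
        val counts = events.updateStateByKey[Long] { (added, state) =>
          Some(state.getOrElse(0L) + added.sum)
        }

        // Batch times are aligned to multiples of the batch interval,
        // so this condition holds once every 5 minutes; only then do we
        // push the current counts out to the database.
        counts.foreachRDD { (rdd, time) =>
          if (time.milliseconds % (5 * 60 * 1000) == 0) {
            rdd.foreachPartition(_.foreach { case (t, c) => saveCount(t, c) })
          }
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }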

Regards,

Bryan Jeffrey
