Hi,

I would like to cache values and use only the latest "valid" value from each 
device to build a sum.
In more detail: I receive values from devices periodically, and I would like to 
add up all the valid values once per minute. But not every device sends a new 
value every minute, and as long as no new value has arrived, the old one should 
be used in the sum. As soon as I receive a new value from a device, I would 
like to overwrite the old value and use the new one in the sum instead. Would 
that be possible with Spark Streaming alone, or would I need some kind of 
distributed cache, like Redis? I also need to group the sums per region. Should 
that grouping be done before I store the values in the cache, or afterwards?
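For reference, here is a rough sketch of the per-device logic I have in mind, written in plain Python and independent of any Spark API (the class and method names are my own, purely illustrative):

```python
from collections import defaultdict


class LatestValueCache:
    """Keeps the most recent value per device and sums them per region."""

    def __init__(self):
        self.latest = {}   # device_id -> latest value received
        self.region = {}   # device_id -> region the device belongs to

    def update(self, device_id, region, value):
        # A new reading simply overwrites the old one for that device.
        self.latest[device_id] = value
        self.region[device_id] = region

    def sums_per_region(self):
        # Called once per minute: sum the latest known value of every
        # device, grouped by region. Devices that sent nothing this
        # minute still contribute their last received value.
        sums = defaultdict(float)
        for device_id, value in self.latest.items():
            sums[self.region[device_id]] += value
        return dict(sums)
```

So essentially I am asking whether this per-device "overwrite and re-sum" state can live inside Spark Streaming itself, or whether it belongs in an external store.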

Thank you in advance.

Regards,
Daniela

