Not sure if this makes sense, but it would be nice to have some kind of
"flag" available within the code that tells me whether I'm running in a
"normal" situation or during a recovery.
To better explain this, let's consider the following scenario:
I am processing data, say from a Kafka stream, and I am updating a
database based on the computations. During recovery I don't want to
update the database again (for many reasons; let's just assume that), but
I do want my system to end up in the same state as before. So I would
like to know whether my code is running for the first time or during a
recovery, so that I can avoid updating the database again (see the sketch
below). More generally, I want to know this whenever I'm interacting with
external entities.
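
Something like the sketch below is what I'm after. This is Scala against
the standard DStream API; the checkpoint path, the socket source, and
updateDatabase are placeholders, and deriving the flag from the JVM start
time is just one guess at how such a flag could be approximated, not an
existing Spark feature:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext, Time}

object RecoveryAwareApp {
  // Re-evaluated on every JVM start: Scala object fields are static and
  // are not serialized into the checkpoint, so a closure recovered from
  // a checkpoint reads this fresh value, not the one captured before
  // the crash.
  val startupTime: Long = System.currentTimeMillis()

  val checkpointDir = "hdfs:///tmp/recovery-flag-demo"  // placeholder path

  def createContext(): StreamingContext = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("RecoveryAwareApp"), Seconds(10))
    ssc.checkpoint(checkpointDir)

    // Stand-in for the real Kafka stream.
    val stream = ssc.socketTextStream("localhost", 9999)

    stream.foreachRDD { (rdd, batchTime: Time) =>
      // The "flag": a batch whose time predates this JVM's start can
      // only have been replayed from the checkpoint during recovery.
      val isRecoveredBatch = batchTime.milliseconds < startupTime
      if (isRecoveredBatch) {
        rdd.count()  // redo the computation to rebuild state, skip the write
      } else {
        rdd.foreachPartition { records =>
          // updateDatabase(records)  // placeholder external write
        }
      }
    }
    ssc
  }

  def main(args: Array[String]): Unit = {
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}

A more robust variant would probably record the batch time (or Kafka
offsets) of the last successful write in the database itself and skip
anything at or below it, but that is exactly the kind of plumbing a
built-in flag would let me avoid.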