Hi,
We have a 6-node Spark cluster with some PySpark jobs running on it.
One of the jobs depends on an external application, and for resiliency we
retry the call a couple of times.
Would it be fine to add some wait time between two attempts (using
time.sleep()), or could there be any sync issues?
I wanted to understand the behaviour and any issues this might cause.
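
For context, roughly what I have in mind is sketched below (call_with_retries,
MAX_RETRIES and WAIT_SECONDS are just placeholders for illustration, not our
actual code):

import time

MAX_RETRIES = 3       # how many times we retry the external application
WAIT_SECONDS = 30     # wait time between two attempts

def call_with_retries(func, *args, **kwargs):
    # Retry the call, sleeping between attempts; re-raise on the last failure.
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            return func(*args, **kwargs)
        except Exception:
            if attempt == MAX_RETRIES:
                raise
            time.sleep(WAIT_SECONDS)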

Thanks,
Praneeth





