I think you need to implement a timeout in your code. As far as I know, Spark will not interrupt the execution of your code as long as the driver is connected. Might be worth a try, though.
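For illustration, here is a minimal sketch of that timeout idea in Python (the `with_timeout` helper is my own name, not a Spark API); inside PySpark you could wrap the per-record work with something like this in a `map`:

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def with_timeout(fn, args, seconds):
    """Run fn(*args), giving up on the result after the deadline.

    Note: a thread that truly hangs forever cannot be force-killed in
    Python; for a hard kill you would need a separate process instead.
    """
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn, *args)
        try:
            return future.result(timeout=seconds)
        except TimeoutError:
            return None  # or a sentinel / re-raise, depending on your needs

# Hypothetical usage inside a PySpark job:
# rdd.map(lambda rec: with_timeout(process, (rec,), seconds=30))
```

Records whose processing exceeds the deadline then come back as `None` (or whatever sentinel you choose) and can be filtered out downstream, instead of stalling the whole stage.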
On Tue, Jun 17, 2014 at 7:54 PM, Peng Cheng <pc...@uow.edu.au> wrote:
> I've tried enabling speculative jobs. This seems to have partially solved the
> problem, but I'm not sure it can handle large-scale situations, as it
> only starts when 75% of the job is finished.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/What-is-the-best-way-to-handle-transformations-or-actions-that-takes-forever-tp7664p7752.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.