Is that possible, and if not, how would one do it from PySpark? This probably doesn't make sense in most cases, but I'm writing a script whose job is to download data and push it into Cassandra. Sometimes a task hangs forever, and I don't really mind killing it; the job isn't computing a result that requires every task to succeed.
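
In case it helps to make the question concrete, below is a rough sketch of the only workaround I can think of from the PySpark side: having each task time itself out with SIGALRM instead of being killed from the outside. Everything in it (slow_push, the 30-second limit, the app name) is a placeholder for illustration, not my actual script; I'd still prefer a way to kill the hung task directly.

```python
# Sketch only: slow_push stands in for my real "download + INSERT into
# Cassandra" step, and the 30 s limit is arbitrary. The idea is that a
# task that hangs gives up on its own instead of blocking the stage.
import signal
import time

from pyspark import SparkContext


def with_timeout(seconds, fn, *args):
    """Run fn(*args); raise TimeoutError if it takes longer than `seconds`.

    Note: SIGALRM handlers only work in the worker process's main thread.
    """
    def _handler(signum, frame):
        raise TimeoutError("task item timed out")

    old = signal.signal(signal.SIGALRM, _handler)
    signal.alarm(seconds)
    try:
        return fn(*args)
    finally:
        signal.alarm(0)
        signal.signal(signal.SIGALRM, old)


def slow_push(item):
    # Placeholder for the real work (download the item, write it to Cassandra).
    time.sleep(item % 5)
    return item


def process_partition(items):
    for item in items:
        try:
            with_timeout(30, slow_push, item)
        except TimeoutError:
            # Losing this record is fine; just move on to the next one.
            pass


if __name__ == "__main__":
    sc = SparkContext(appName="cassandra-loader-sketch")
    sc.parallelize(range(100)).foreachPartition(process_partition)
    sc.stop()
```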
Thanks, Mohamed.
