Hi all,

Today I hit a weird bug in Spark 2.0.2 (vanilla Spark): the Executors tab shows a negative number of active tasks.

I have about 25 jobs, each with 20k tasks, so the numbers are not that crazy. What could possibly be the cause of this bug? This is the first time I've seen it, and the only special thing I'm doing is saving multiple datasets to HDFS at the same time from different threads (see the sketch below).
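To make the pattern concrete, here is a minimal sketch of what I mean by concurrent saves; the dataset names, paths, and app name are placeholders, not my actual job:

    import scala.concurrent.{Await, Future}
    import scala.concurrent.duration.Duration
    import scala.concurrent.ExecutionContext.Implicits.global
    import org.apache.spark.sql.{Dataset, Row, SparkSession}

    val spark = SparkSession.builder.appName("concurrent-saves").getOrCreate()

    // Placeholder inputs; each save below triggers its own Spark job.
    val ds1: Dataset[Row] = spark.read.parquet("hdfs:///input/a")
    val ds2: Dataset[Row] = spark.read.parquet("hdfs:///input/b")

    // Each write is submitted from its own thread, so the jobs run
    // concurrently on the same SparkContext.
    val saves = Seq(
      Future(ds1.write.parquet("hdfs:///output/a")),
      Future(ds2.write.parquet("hdfs:///output/b"))
    )
    Await.result(Future.sequence(saves), Duration.Inf)

Thanks,
Andy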