Re: Accumulators of Spark 1.x no longer work with Spark 2.x

2018-03-15 Thread Sergey Zhemzhitsky
One more option is to override writeReplace [1] in LegacyAccumulatorWrapper to prevent such failures. What do you think?

[1] https://github.com/apache/spark/blob/4f5bad615b47d743b8932aea1071652293981604/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala#L158
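For context, writeReplace is the standard java.io.Serializable replacement hook: if a class defines it, Java serialization writes the object it returns in place of the original instance, which is what lets a wrapper substitute a "zero" copy of itself at serialization time. A minimal, self-contained sketch of the mechanism itself (class names hypothetical, not Spark's implementation):

    import java.io._

    // Illustrative wrapper whose serialized form is a zero-value copy,
    // mirroring the idea of having LegacyAccumulatorWrapper ship a reset
    // copy to executors instead of its accumulated driver-side state.
    class ZeroOnSerialize(var value: Long) extends Serializable {
      // Java serialization invokes this hook (if defined) and writes the
      // returned object in place of `this`.
      protected def writeReplace(): Any = new ZeroOnSerialize(0L)
    }

    object WriteReplaceDemo {
      def main(args: Array[String]): Unit = {
        val acc = new ZeroOnSerialize(42L)

        // Round-trip through plain Java serialization.
        val buf = new ByteArrayOutputStream()
        val oos = new ObjectOutputStream(buf)
        oos.writeObject(acc)
        oos.flush()
        val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
        val restored = in.readObject().asInstanceOf[ZeroOnSerialize]

        println(restored.value) // prints 0: the replacement was serialized, not the 42
      }
    }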

Re: accumulators

2014-10-17 Thread Reynold Xin
It certainly makes sense for a single streaming job. But it is definitely non-trivial to make this useful to all Spark programs. If I were to have a long-running SparkContext and submit a wide variety of jobs to it, this would make the list of accumulators very, very large. Maybe the solution is ...
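For what it's worth, a common way to keep such a registry bounded in a long-running process is to hold the accumulators through weak references, so entries for accumulators the user program has dropped can be garbage-collected and pruned (Spark's later AccumulatorContext registry works along these lines). A minimal sketch of that idea, with hypothetical names rather than anything proposed in this thread:

    import java.lang.ref.WeakReference
    import java.util.concurrent.ConcurrentHashMap
    import java.util.concurrent.atomic.AtomicLong

    // Hypothetical driver-side registry that does not pin every accumulator
    // ever registered: each entry is a WeakReference, and entries whose
    // referent has been collected are pruned lazily on lookup.
    object AccumulatorRegistry {
      private val nextId = new AtomicLong(0L)
      private val registry = new ConcurrentHashMap[Long, WeakReference[AnyRef]]()

      def register(acc: AnyRef): Long = {
        val id = nextId.getAndIncrement()
        registry.put(id, new WeakReference(acc))
        id
      }

      def get(id: Long): Option[AnyRef] = {
        val ref = registry.get(id)
        if (ref == null) None
        else {
          val acc = ref.get()
          if (acc == null) {
            registry.remove(id) // referent was GC'd: drop the stale entry
            None
          } else Some(acc)
        }
      }

      def size: Int = registry.size()
    }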