Thanks Josh ... I'll take a look
On 31 Aug 2015 19:21, "Josh Rosen" wrote:
> There are currently a few known issues with using KryoSerializer as the
> closure serializer, so it's going to require some changes to Spark if we
> want to properly support this. See
> https://github.com/apache/spark/pu
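
For context, a hedged sketch of the kind of configuration being discussed: in Spark 1.x the closure serializer could in principle be pointed at Kryo via the `spark.closure.serializer` setting (this config key is an assumption based on the 1.x-era SparkEnv; per Josh's note above, doing so hits known issues and is not properly supported):

```scala
import org.apache.spark.SparkConf

// Hypothetical attempt, Spark 1.x era: use Kryo for both data and closures.
// The second setting is the problematic one discussed in this thread.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.closure.serializer", "org.apache.spark.serializer.KryoSerializer")
```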
Hi devs,
Currently the only supported serializer for serializing tasks in
DAGScheduler.scala is JavaSerializer:
val taskBinaryBytes: Array[Byte] = stage match {
  case stage: ShuffleMapStage =>
    closureSerializer.serialize((stage.rdd, stage.shuffleDep): AnyRef).array()
  case stage: ResultStage =>
    closureSerializer.serialize((stage.rdd, stage.func): AnyRef).array()
}
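
To make the JavaSerializer behavior concrete, here is a minimal, self-contained sketch (plain JDK object serialization, no Spark dependency; the object and method names are illustrative, not Spark's) of what serializing a closure with Java serialization amounts to. It works because Scala function literals compile to classes implementing `java.io.Serializable`; a closure capturing a non-serializable value would instead throw `NotSerializableException` at this step:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

object ClosureSerDemo {
  // Round-trip an object through Java serialization, as JavaSerializer does.
  def serialize(obj: AnyRef): Array[Byte] = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(obj)
    oos.close()
    bos.toByteArray
  }

  def deserialize[T](bytes: Array[Byte]): T = {
    val ois = new ObjectInputStream(new ByteArrayInputStream(bytes))
    ois.readObject().asInstanceOf[T]
  }

  def main(args: Array[String]): Unit = {
    // A Scala function literal is serializable, so it survives the round trip.
    val f: Int => Int = (x: Int) => x + 1
    val g = deserialize[Int => Int](serialize(f))
    println(g(41)) // prints 42
  }
}
```

Swapping Kryo in here is not a drop-in change: Kryo has its own requirements for class registration and closure handling, which is where the issues Josh mentions come from.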