I am getting the following error in a Spark task. The default max value is 64mb! The documentation says it should be large enough to store the largest object in my application, and I don't think I have any object that is bigger than 64mb. So what do these values (spark.kryoserializer.buffer, spark.kryoserializer.buffer.max) mean?

Is that one buffer per executor, or one buffer per executor per core? I have 6 cores per executor, so do all 6 write to this one shared buffer? In that case each core would only get roughly 10mb of the buffer. Please explain. Thanks!
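
For context, here is roughly how these two settings get wired up; this is a minimal sketch, and the sizes shown are Spark's documented defaults, not necessarily what I should be using:

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch: enable Kryo and set the two buffer properties in question.
    // The sizes below are the documented defaults, used here only as examples;
    // the app name is just a placeholder.
    val conf = new SparkConf()
      .setAppName("kryo-buffer-question")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryoserializer.buffer", "64k")     // initial size of the serialization buffer
      .set("spark.kryoserializer.buffer.max", "64m") // limit the buffer may grow to before "Buffer overflow"
    val sc = new SparkContext(conf)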


Job aborted due to stage failure: Task 3 in stage 4.0 failed 10 times, most recent failure: Lost task 3.9 in stage 4.0 (TID 16, iadprd01mpr005.mgmt-a.xactly.iad.dc.local): org.apache.spark.SparkException: Kryo serialization failed: Buffer overflow. Available: 0, required: 19. To avoid this, increase spark.kryoserializer.buffer.max value.
    at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:300)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:313)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
