dropped in recent Spark versions.
>
> > On 30 Jan 2017, at 00:44, aravasai <[hidden email]> wrote:
I have a Spark job running on 2 terabytes of data which creates more than
30,000 partitions. As a result, the job fails with the error
"Map output statuses were 170415722 bytes which exceeds spark.akka.frameSize
52428800 bytes" (for 1 TB of data).
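
For reference, a minimal sketch of how these limits can be raised (the
512 MB value and the app name are illustrative assumptions, not tuned
recommendations; spark.akka.frameSize applies to Spark 1.x, and
spark.rpc.message.maxSize is its replacement after Akka was dropped in
Spark 2.x, both specified in MB):

import org.apache.spark.{SparkConf, SparkContext}

// Raise the RPC frame size so large map output status messages fit.
// Both values are in MB; 512 is an illustrative choice, not a tuned one.
val conf = new SparkConf()
  .setAppName("large-shuffle-job")          // hypothetical app name
  .set("spark.akka.frameSize", "512")       // Spark 1.x setting (MB)
  .set("spark.rpc.message.maxSize", "512")  // Spark 2.x replacement (MB)
val sc = new SparkContext(conf)

// Alternatively, shrink the status payload by reducing the partition
// count, e.g. bigRdd.coalesce(10000) (hypothetical target count).

Fewer, larger partitions shrink the map output status payload the driver
has to ship to executors, which is often a more robust fix than simply
raising the frame size.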
However, when I increase spark.akka.frameSize to