How big is your driver heap size? And any reason why you'd need 200k map
and 200k reduce tasks?
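For context, a rough back-of-envelope sketch of why that task count is painful for the driver: if each map task's MapStatus had to track one bit per reduce partition, the uncompressed worst case alone is several gigabytes (assumption: one bit per block; Spark's RoaringBitmaps are compressed, so real usage depends on how many blocks are empty).

```python
# Hypothetical worst-case estimate for 200k map * 200k reduce tasks.
# Assumes one status bit per shuffle block, uncompressed; actual Spark
# MapStatus objects use compressed RoaringBitmaps, so this is an upper bound
# on the bitmap payload, not a measurement.
num_map_tasks = 200_000
num_reduce_tasks = 200_000

bits_total = num_map_tasks * num_reduce_tasks   # one bit per shuffle block
bytes_total = bits_total // 8                    # 5e9 bytes
gib = bytes_total / (1024 ** 3)
print(f"~{gib:.1f} GiB of bitmap data on the driver")
```

Even with good compression, 200k MapStatus objects each carrying per-partition metadata adds up fast, which is why the driver heap size matters here.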


On Mon, Oct 19, 2015 at 11:59 PM, yaoqin <yao...@huawei.com> wrote:

> Hi everyone,
>
>     When I run a Spark job that contains a large number of tasks (in my
> case, 200,000 map * 200,000 reduce), the driver hits an OOM, mainly caused
> by MapStatus objects.
>
> As shown in the picture below, the RoaringBitmap used to mark which blocks
> are empty seems to use too much memory.
>
>     Is there any data structure that could replace RoaringBitmap to fix my
> problem?
>
>
>
>     Thank you!
>
>     Qin.
>
>