If you are building a huge map on the driver, then spark.driver.memory
should be set high enough to hold it. Since you are going to broadcast
this map, your Spark executors must have enough memory to hold a copy of
it as well, which you can tune with the spark.executor.memory and
spark.storage.memoryFraction configurations.
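
As a rough sketch, those settings can be passed at submit time. The memory
values below are placeholders (tune them to the serialized size of your
map), and the class/jar names are made up for illustration:

```shell
# Placeholder sizes -- measure your map's serialized size first.
spark-submit \
  --driver-memory 8g \
  --executor-memory 4g \
  --conf spark.storage.memoryFraction=0.6 \
  --class com.example.MyJob myjob.jar
```

spark.storage.memoryFraction controls what share of each executor's heap
is reserved for storage (cached and broadcast blocks), so the broadcast
copy must fit within that slice of spark.executor.memory.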

Thanks
Best Regards

On Mon, Dec 21, 2015 at 5:50 AM, Pat Ferrel <p...@occamsmachete.com> wrote:

> I have a large Map that is assembled in the driver and broadcast to each
> node.
>
> My question is how best to allocate memory for this.  The Driver has to
> have enough memory for the Maps, but only one copy is serialized to each
> node. What type of memory should I size to match the Maps? Is the broadcast
> Map taking a little from each executor, all from every executor, or is
> there something other than driver and executor memory I can size?
>
