No, but the JVM also does not allocate memory for native code on the heap.
The JVM heap has no bearing on whether your native code can allocate more
memory, except that the heap itself consumes part of the same process's memory.
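
To make the distinction concrete, here is a minimal, Linux-only sketch (the
/proc paths and the standalone class are illustrative, not part of any Spark
API): Runtime.getRuntime().freeMemory() only reports JVM heap headroom, while
malloc() inside your native code is bounded by process-level limits such as
the address-space ulimit and resident-set ceiling.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class NativeMemCheck {
        public static void main(String[] args) throws IOException {
            // JVM heap headroom; this says nothing about what native malloc() can get:
            Runtime rt = Runtime.getRuntime();
            System.out.printf("heap free: %d MB, heap max: %d MB%n",
                rt.freeMemory() >> 20, rt.maxMemory() >> 20);

            // Native allocations come out of the process address space, so the
            // relevant ceilings are the process limits (Linux-specific paths):
            Files.readAllLines(Paths.get("/proc/self/limits")).stream()
                 .filter(l -> l.startsWith("Max address space")
                           || l.startsWith("Max resident set"))
                 .forEach(System.out::println);
            Files.readAllLines(Paths.get("/proc/self/status")).stream()
                 .filter(l -> l.startsWith("VmSize") || l.startsWith("VmRSS"))
                 .forEach(System.out::println);
        }
    }

If "Max address space" (or a container memory limit) is close to VmSize,
native malloc() can fail with ENOMEM even though the heap reports gigabytes free.
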
On Oct 30, 2014 6:43 PM, "Paul Wais" <pw...@yelp.com> wrote:

> Dear Spark List,
>
> I have a Spark app that runs native code inside map functions.  I've
> noticed that the native code sometimes sets errno to ENOMEM indicating
> a lack of available memory.  However, I've verified that the /JVM/ has
> plenty of heap space available-- Runtime.getRuntime().freeMemory()
> shows gigabytes free and the native code needs only megabytes.  Does
> Spark limit the /native/ heap size somehow?  I'm poking through the
> executor code now but don't see anything obvious.
>
> Best Regards,
> -Paul Wais
>
