Dear Spark List,

I have a Spark app that runs native code inside map functions. I've
noticed that the native code sometimes sets errno to ENOMEM, indicating
a lack of available memory. However, I've verified that the /JVM/ has
plenty of heap space available -- Runtime.getRuntime().freeMemory()
reports gigabytes free, and the native code needs only megabytes. Does
Spark limit the /native/ heap size somehow? I'm poking through the
executor code now but don't see anything obvious.
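
For concreteness, the pattern looks roughly like this (a minimal
sketch; NativeLib.process below is just a stand-in for my real JNI
wrapper, which is what sets errno):

import org.apache.spark.rdd.RDD

// Stand-in for the real JNI wrapper; the actual native call is the
// one that occasionally fails with errno == ENOMEM.
object NativeLib {
  def process(bytes: Array[Byte]): Array[Byte] = bytes
}

def runNative(records: RDD[Array[Byte]]): RDD[Array[Byte]] =
  records.mapPartitions { iter =>
    val rt = Runtime.getRuntime
    // JVM heap looks healthy right before the native calls:
    // freeMemory() reports gigabytes available on the executor.
    println(s"JVM heap: free=${rt.freeMemory()} " +
      s"total=${rt.totalMemory()} max=${rt.maxMemory()}")
    // The native call should only need a few MB of off-heap memory.
    iter.map(bytes => NativeLib.process(bytes))
  }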

Best Regards,
-Paul Wais
