Try tuning options such as spark.storage.memoryFraction and spark.executor.memory, described here:
http://spark.apache.org/docs/latest/configuration.html.
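For example, both settings can be applied on a SparkConf before the context is
created. This is only a minimal sketch for a standalone driver (the app name,
master and memory values are illustrative, not taken from your job):

  import org.apache.spark.{SparkConf, SparkContext}

  // Illustrative values only -- size them to the memory actually available.
  val conf = new SparkConf()
    .setAppName("WikipediaJob")                  // hypothetical app name
    .setMaster("local[*]")                       // local mode just for the example
    .set("spark.executor.memory", "4g")          // heap per executor
    .set("spark.storage.memoryFraction", "0.4")  // fraction of heap kept for cached RDDs
  val sc = new SparkContext(conf)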
Thanks
Prashant Sharma
On Mon, May 5, 2014 at 9:34 PM, Ajay Nair wrote:
> Hi,
>
> Is there any way to overcome this error? I am running this from the
> spark-shell; is that a cause for concern?
Hi,
Is there any way to overcome this error? I am running this from the
spark-shell; is that a cause for concern?
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-spark-on-27gb-wikipedia-data-tp6487p6490.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
I just thought maybe we could print a warning whenever that error comes up,
telling the user they can tune either the memoryFraction or executor memory
options. This warning would get displayed when the TaskSetManager receives
task failures due to OOM.
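Just to make the idea concrete, a standalone sketch of the check (the names
below are purely illustrative, not the actual TaskSetManager internals):

  // Hypothetical helper: emit a tuning hint when a task died with an OOM.
  object OomWarning {
    def warnIfOom(exceptionClassName: String): Unit = {
      if (exceptionClassName == classOf[OutOfMemoryError].getName) {
        System.err.println(
          "WARN: task failed with OutOfMemoryError; consider tuning " +
          "spark.storage.memoryFraction or spark.executor.memory " +
          "(http://spark.apache.org/docs/latest/configuration.html)")
      }
    }
  }

  // Example: OomWarning.warnIfOom("java.lang.OutOfMemoryError")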
Prashant Sharma
On Mon, May 5, 2014 at 2:10 PM, Ajay Nair wrote:
> Hi,