Hi Ashesh

You might be experiencing problems with the virtual memory allocation.
Try grepping the yarn-hadoop-nodemanager-*.log (found in
$HADOOP_INSTALL/logs) for 'virtual memory limits'
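For example, something along these lines (the log file and its contents below are made up for the demo; on a real cluster you would grep the actual NodeManager logs under $HADOOP_INSTALL/logs):

```shell
# Demo: create a fake NodeManager log, then grep it the same way you
# would on a real cluster (where HADOOP_INSTALL is your install dir)
HADOOP_INSTALL=$(mktemp -d)
mkdir -p "$HADOOP_INSTALL/logs"
echo 'WARN ContainersMonitorImpl: ... is running beyond virtual memory limits.' \
  > "$HADOOP_INSTALL/logs/yarn-hadoop-nodemanager-demo.log"
grep 'virtual memory limits' "$HADOOP_INSTALL"/logs/yarn-hadoop-nodemanager-*.log
```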
If you see a message like
---------
WARN 
org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl:
Container [pid=5692,containerID=container_1460451108851_0001_02_000001]
is running beyond virtual memory limits. Current usage: 248.7 MB of 1
GB physical memory used; 2.1 GB of 2.1 GB virtual memory used. Killing
container.
---------
that's your problem. (The 2.1 GB limit comes from the default yarn.nodemanager.vmem-pmem-ratio of 2.1 applied to the container's 1 GB of physical memory.)

You can solve it by setting yarn.nodemanager.vmem-pmem-ratio to a
higher value in yarn-site.xml, like this:
----------
 <property>
   <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <value>5</value>
    <description>Ratio between virtual memory to physical memory when
setting memory limits for containers</description>
  </property>
----------

You can also disable the virtual memory check entirely.
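If you go that route, the check is controlled by the yarn.nodemanager.vmem-check-enabled property (it defaults to true); set it to false in yarn-site.xml:
----------
 <property>
   <name>yarn.nodemanager.vmem-check-enabled</name>
   <value>false</value>
   <description>Whether virtual memory limits will be enforced for
containers</description>
 </property>
----------
Note that disabling the check only hides the symptom rather than limiting memory use, so raising the ratio is usually the gentler fix.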

See https://issues.apache.org/jira/browse/YARN-4714 for further information.

/ Jon
/ jokja
Venlig hilsen/Best regards

Jon Kjær Amundsen
Information Architect & Product Owner

Phone: +45 7023 9080
Direct: +45 8882 1331
E-mail: j...@udbudsvagten.dk
Web: www.udbudsvagten.dk
Nitivej 10 | DK - 2000 Frederiksberg


Intelligent Public Procurement
Before, during and after tenders

Follow UdbudsVagten and the market on LinkedIn


2016-04-12 8:06 GMT+02:00 ashesh_28 <asheshro...@gmail.com>:
> I have updated all my nodes in the Cluster to have 4GB RAM memory , but still
> face the same error when trying to launch Spark-Shell  in yarn-client mode
>
> Any suggestion ?
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-on-Yarn-Client-Cluster-mode-tp26691p26752.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>

