Hi, I'm trying to run the Spark examples on YARN and I'm getting the following error:

appDiagnostics: Application application_1390483691679_0124 failed 2 times
due to AM Container for appattempt_1390483691679_0124_000002 exited with
exitCode: 1 due to: Exception from container-launch:
org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
        at org.apache.hadoop.util.Shell.run(Shell.java:379)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:724)
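
The diagnostics only say that the container launch exited with code 1, so if the actual stdout/stderr of the AM container would help, I can pull it with something like the following (assuming log aggregation is enabled on the cluster; otherwise I'd dig through the NodeManager's local container log dirs):

# fetch the aggregated container logs for the failed application
yarn logs -applicationId application_1390483691679_0124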

This is the command I'm running:

SPARK_JAR=/opt/spark/assembly/target/spark-assembly_2.10-0.9.0-incubating.jar \
  /opt/spark/bin/spark-class org.apache.spark.deploy.yarn.Client \
    --jar /opt/spark/examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.0-incubating.jar \
    --class org.apache.spark.examples.SparkPi \
    --args yarn-standalone
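
For comparison, the 0.9.0 "Running Spark on YARN" docs show the same submission with explicit resource flags. This is the fuller variant as I understand it from the docs; the worker/memory numbers below are placeholders I made up, not values I have actually tuned:

# same submission, with the optional resource flags from the 0.9.0 YARN docs
SPARK_JAR=/opt/spark/assembly/target/spark-assembly_2.10-0.9.0-incubating.jar \
  /opt/spark/bin/spark-class org.apache.spark.deploy.yarn.Client \
    --jar /opt/spark/examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.0-incubating.jar \
    --class org.apache.spark.examples.SparkPi \
    --args yarn-standalone \
    --num-workers 2 \
    --master-memory 1g \
    --worker-memory 1g \
    --worker-cores 1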

And my spark-env.sh file:
export SCALA_HOME=/opt/scala
export SPARK_LIBRARY_PATH="/usr/lib/hadoop/lib/native:$SPARK_LIBRARY_PATH"
export HADOOP_CONF_DIR="/etc/hadoop/conf/"
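
Would it also make sense to point YARN_CONF_DIR at the same client-side Hadoop config directory? The YARN docs say either HADOOP_CONF_DIR or YARN_CONF_DIR should work, so this is just a guess on my part:

# possibly redundant with HADOOP_CONF_DIR above, added just in case
export YARN_CONF_DIR="/etc/hadoop/conf/"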

Am I doing something wrong?



