Is YARN_CONF_DIR set?
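If not, a minimal sketch of setting it (the `/etc/hadoop/conf` path is an assumption; point it at wherever your cluster's `*-site.xml` client configs actually live):

```shell
# Tell Spark where to find the Hadoop/YARN client configuration
# (core-site.xml, yarn-site.xml, etc.). Path below is an example only.
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf
```

Without these, spark-submit cannot locate the ResourceManager and may silently fall back to running locally.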

--- Original Message ---

From: "Aniket Bhatnagar" <aniket.bhatna...@gmail.com>
Sent: February 4, 2015 6:16 AM
To: "kundan kumar" <iitr.kun...@gmail.com>, "spark users" 
<user@spark.apache.org>
Subject: Re: Spark Job running on localhost on yarn cluster

Have you set the master in SparkConf/SparkContext in your code? A master
hard-coded there overrides the one passed to spark-submit. The driver logs
show which mode the Spark job is running in; double-check whether they
mention local or yarn-cluster.
Also, what is the error that you are getting?
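One way to see which master spark-submit actually resolved, without digging through driver logs, is its `--verbose` flag, which prints the parsed arguments before launching (class name and jar path below are taken from the command quoted later in this thread):

```shell
# --verbose makes spark-submit print its resolved configuration,
# including the effective master, before the job starts.
$SPARK_HOME/bin/spark-submit --verbose \
  --master yarn-cluster \
  --class "EDDApp" \
  target/scala-2.10/edd-application_2.10-1.0.jar
```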

On Wed, Feb 4, 2015, 6:13 PM kundan kumar <iitr.kun...@gmail.com> wrote:

> Hi,
>
> I am trying to execute my code on a yarn cluster
>
> The command which I am using is
>
> $SPARK_HOME/bin/spark-submit --class "EDDApp"
> target/scala-2.10/edd-application_2.10-1.0.jar --master yarn-cluster
> --num-executors 3 --driver-memory 6g --executor-memory 7g <outpuPath>
>
> But I can see that the program is running only on localhost.
>
> It is able to read the file from HDFS.
>
> I have tried this in standalone mode and it works fine.
>
> Please suggest where it is going wrong.
>
>
> Regards,
> Kundan
>
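A likely cause, given the command quoted above: spark-submit treats everything after the application jar as arguments to the application's main method. Since `--master yarn-cluster` (and the other flags) come after the jar, they never reach spark-submit, which then defaults to running locally. A corrected ordering would be (`<outpuPath>` kept as the placeholder from the original mail):

```shell
# All spark-submit options must come BEFORE the application jar;
# anything after the jar path is passed to the application itself.
$SPARK_HOME/bin/spark-submit \
  --class "EDDApp" \
  --master yarn-cluster \
  --num-executors 3 \
  --driver-memory 6g \
  --executor-memory 7g \
  target/scala-2.10/edd-application_2.10-1.0.jar \
  <outpuPath>
```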