Hey guys, I resolved the issue. There was an entry in the /etc/hosts file
mapping the machine's hostname to localhost, because of which YARN was trying
to connect to the Spark driver on localhost of the client machine.

Once the entry was removed, it picked up the real hostname and was able to connect.
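For anyone hitting the same problem, this is roughly what the offending entry
looked like (the hostname `client-host` and IP below are made up for
illustration):

```shell
# /etc/hosts on the client machine -- the problematic entry mapped the
# machine's own hostname onto the loopback line:
#
#   127.0.0.1   localhost client-host
#
# Because of that, Spark advertised the driver address as localhost, so the
# YARN ApplicationMaster on the cluster tried (and failed) to connect back
# to its *own* localhost. The fix is to keep only localhost on the loopback
# line and map the hostname to the machine's real, cluster-reachable IP:

127.0.0.1     localhost
192.168.1.10  client-host

# Verify that the hostname now resolves to the external IP:
getent hosts "$(hostname)"
```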

Thanks Jongyoul Lee and Todd Nist for the help. This forum is really great!

Naveen, I will try to post the simple steps I followed for configuring
Zeppelin with YARN, most probably tomorrow.


Thanks
Manya

On Wed, Aug 5, 2015 at 5:35 PM, Naveenkumar GP <[email protected]>
wrote:

> No. How to do that one?
>
>
>
> *From:* Todd Nist [mailto:[email protected]]
> *Sent:* Wednesday, August 05, 2015 5:34 PM
> *To:* [email protected]
> *Subject:* Re: Exception while submitting spark job using Yarn
>
>
>
> Have you built Zeppelin against the version of Hadoop & Spark you are
> using?  It has to be built with the appropriate versions, as this will pull
> in the required libraries from Hadoop and Spark.  By default, Zeppelin will
> not work on YARN without doing the build.
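> For reference, the build is typically something along these lines (the
> profile names and version numbers below are examples; substitute the Spark
> and Hadoop versions your cluster actually runs):
>
> ```shell
> # Build Zeppelin from source against specific Spark/Hadoop versions.
> # -Pspark-1.4 / -Phadoop-2.6 / -Pyarn select Maven build profiles;
> # the versions shown here are illustrative, not prescriptive.
> mvn clean package -Pspark-1.4 -Phadoop-2.6 -Pyarn \
>   -Dhadoop.version=2.6.0 -DskipTests
> ```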
>
> @deepujain posted a fairly comprehensive guide on the forum to follow,
> with the steps to take to deploy under Hadoop and YARN.  It was posted
> yesterday, August 4th.
>
>
>
> HTH.
>
>
>
> -Todd
>
>
>
>
>
> On Wed, Aug 5, 2015 at 2:35 AM, manya cancerian <[email protected]>
> wrote:
>
> Hi guys,
>
>
>
> I am trying to run Zeppelin using YARN as the resource manager. I have made
> the following changes:
>
>
>
> 1. I have specified the master as 'yarn-client' in the interpreter settings
> using the UI.
>
> 2. I have specified HADOOP_CONF_DIR as the conf directory containing the
> Hadoop configuration files.
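> (Step 2 is usually done in conf/zeppelin-env.sh; the path below is an
> example, use wherever your *-site.xml files actually live:)
>
> ```shell
> # conf/zeppelin-env.sh -- point Zeppelin at the Hadoop client configs
> # so the yarn-client master can find the ResourceManager and HDFS.
> export HADOOP_CONF_DIR=/etc/hadoop/conf
>
> # Optionally pin the Spark installation Zeppelin should use:
> # export SPARK_HOME=/opt/spark
> ```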
>
>
>
> In my scenario I have three machines:
>
> a. The client machine, where Zeppelin is installed
>
> b. The machine running the YARN resource manager, along with the Hadoop
> cluster namenode and a datanode
>
> c. A machine running a datanode
>
>
>
>
>
> When I submit a job from my client machine, it gets submitted to YARN but
> fails with the following exception:
>
>
>
>
>
> 15/08/04 15:08:05 ERROR yarn.ApplicationMaster: Uncaught exception:
>
> org.apache.spark.SparkException: Failed to connect to driver!
>
>       at 
> org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkDriver(ApplicationMaster.scala:424)
>
>       at 
> org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:284)
>
>       at 
> org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:146)
>
>       at 
> org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:575)
>
>       at 
> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
>
>       at 
> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
>
>       at java.security.AccessController.doPrivileged(Native Method)
>
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>
>       at 
> org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
>
>       at 
> org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:573)
>
>       at 
> org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:596)
>
>       at 
> org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
>
> 15/08/04 15:08:05 INFO yarn.ApplicationMaster: Final app status: FAILED, 
> exitCode: 10, (reason: Uncaught exception: Failed to connect to driver!)
>
>
>
>
>
> Any help is much appreciated!
>
>
>
> Regards
>
> Manya
>
>
>
>
>
