Re: Zeppelin in local computer using yarn on distant cluster

2016-11-02 Thread Benoit Hanotte
I am pointing to the dirs on my local machine; what I want is simply for my jobs to be submitted to the distant YARN cluster.

Thanks

On Wed, Nov 2, 2016 at 4:00 PM, Abhi Basu <9000r...@gmail.com> wrote:
> I am assuming you are pointing to hadoop/spark on remote host, right? Can
> you not point ha

Re: Zeppelin in local computer using yarn on distant cluster

2016-11-02 Thread Abhi Basu
I am assuming you are pointing to hadoop/spark on the remote host, right? Can you not point the hadoop conf and spark dirs to the remote machine? Not sure if this works, just suggesting; others may have tried.

On Wed, Nov 2, 2016 at 9:58 AM, Hyung Sung Shim wrote:
> Hello.
> You don't need to install hadoop

Re: Zeppelin in local computer using yarn on distant cluster

2016-11-02 Thread Hyung Sung Shim
Hello.
You don't need to install Hadoop on your machine, but you do need a proper version of Spark [0] so that spark-submit is available. Then you can set [1] SPARK_HOME to the directory where that Spark lives, set HADOOP_CONF_DIR, and set master to yarn-client for your Spark interpreter in the interpreter menu.
[0] http://spark.apache.or
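A minimal sketch of that setup in conf/zeppelin-env.sh, assuming Spark is unpacked under /usr/local/lib/spark (that path is an assumption; the HADOOP_CONF_DIR path is the one used elsewhere in this thread):

    # conf/zeppelin-env.sh -- sketch only; adjust paths to your installation
    export SPARK_HOME=/usr/local/lib/spark                   # assumed location of a local Spark matching the cluster version
    export HADOOP_CONF_DIR=/usr/local/lib/hadoop/etc/hadoop  # directory holding yarn-site.xml, core-site.xml, ...

With those exports in place, the spark interpreter's master property is changed to yarn-client in the Interpreter menu so that spark-submit targets the remote YARN cluster.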

Re: Zeppelin in local computer using yarn on distant cluster

2016-11-02 Thread Benoit Hanotte
I have only set HADOOP_CONF_DIR, as follows (my Hadoop conf files are in /usr/local/lib/hadoop/etc/hadoop/, e.g. /usr/local/lib/hadoop/etc/hadoop/yarn-site.xml):

#!/bin/bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See

Re: Zeppelin in local computer using yarn on distant cluster

2016-11-02 Thread Hyung Sung Shim
Could you share your zeppelin-env.sh?

On Wed, Nov 2, 2016 at 4:57 PM, Benoit Hanotte wrote:
> Thanks for your reply,
> I have tried setting it within zeppelin-env.sh but it doesn't work any
> better.
>
> Thanks
>
> On Wed, Nov 2, 2016 at 2:13 AM, Hyung Sung Shim wrote:
> > Hello.
> > You should set the

Re: Zeppelin in local computer using yarn on distant cluster

2016-11-02 Thread Benoit Hanotte
Thanks for your reply,
I have tried setting it within zeppelin-env.sh but it doesn't work any better.

Thanks

On Wed, Nov 2, 2016 at 2:13 AM, Hyung Sung Shim wrote:
> Hello.
> You should set the HADOOP_CONF_DIR to /usr/local/lib/hadoop/etc/hadoop/
> in the conf/zeppelin-env.sh.
> Thanks.
> 2016

Re: Zeppelin in local computer using yarn on distant cluster

2016-11-01 Thread Hyung Sung Shim
Hello.
You should set the HADOOP_CONF_DIR to /usr/local/lib/hadoop/etc/hadoop/ in the conf/zeppelin-env.sh.
Thanks.

On Wed, Nov 2, 2016 at 5:07 AM, Benoit Hanotte wrote:
> Hello,
>
> I'd like to use zeppelin on my local computer and use it to run spark
> executors on a distant yarn cluster since I can't
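Concretely, the suggested setting amounts to a single line in conf/zeppelin-env.sh (shown here as a sketch; the path is the one given in the thread):

    export HADOOP_CONF_DIR=/usr/local/lib/hadoop/etc/hadoop/  # must contain yarn-site.xml etc. describing the remote cluster

Zeppelin passes this environment through to the Spark interpreter, so spark-submit can locate the remote YARN ResourceManager from yarn-site.xml.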

Zeppelin in local computer using yarn on distant cluster

2016-11-01 Thread Benoit Hanotte
Hello,

I'd like to use Zeppelin on my local computer and use it to run Spark executors on a distant YARN cluster, since I can't easily install Zeppelin on the cluster gateway. I installed the correct Hadoop version (2.6) and compiled Zeppelin (from the master branch) as follows:

mvn clean pac
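The exact command is cut off in the archive; for reference, a typical build of Zeppelin master against Hadoop 2.6 with YARN support at the time looked roughly like the sketch below (profile names and flags are assumptions based on the Zeppelin build instructions of that era, not the poster's verbatim command):

    # sketch only -- not the poster's exact command
    mvn clean package -DskipTests \
        -Phadoop-2.6 -Pyarn \
        -Pspark-1.6 -Ppyspark

The Spark and Hadoop profiles should match the versions running on the remote cluster.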