Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Jeff Zhang
HADOOP_CONF_DIR in zeppelin-env.sh would affect the whole Zeppelin instance, while defining it in the interpreter setting would affect only that interpreter. Jeff Zhang wrote on Sat, Jul 1, 2017 at 7:26 AM: > > HADOOP_CONF_DIR would affect the whole zeppelin instance. and define it > interpreter setting would affect that in

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Jeff Zhang
HADOOP_CONF_DIR would affect the whole Zeppelin instance, while defining it in the interpreter setting would affect only that interpreter. Any all-capitalized property name is treated as an environment variable. Serega Sheypak wrote on Sat, Jul 1, 2017 at 3:20 AM: > hi, thanks for your reply. How should I set this variable? > I'
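To make the instance-wide option concrete, the variable goes in conf/zeppelin-env.sh, so every interpreter inherits it. A minimal sketch, with an illustrative path:

```shell
# conf/zeppelin-env.sh
# Applies to the whole Zeppelin instance (every interpreter inherits it).
# The path below is illustrative; point it at your cluster's client config.
export HADOOP_CONF_DIR=/etc/hadoop/conf
```

This is the approach the linked docs describe, and it is why a single global setting cannot target several clusters at once.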

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Serega Sheypak
Hi, thanks for your reply. How should I set this variable? I'm looking at the Spark interpreter config UI, and it doesn't let me set an env variable. https://zeppelin.apache.org/docs/latest/interpreter/spark.html#1-export-spark_home says that HADOOP_CONF_DIR should be set once per whole Zeppelin instance

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Jeff Zhang
Right, create three Spark interpreters for your 3 YARN clusters. Serega Sheypak wrote on Fri, Jun 30, 2017 at 10:33 PM: > Hi, thanks for your reply! > What do you mean by that? > I can have only one env variable HADOOP_CONF_DIR... > And how can a user pick which env to run? > > Or you mean I have to create three
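A sketch of the per-interpreter approach: in each Spark interpreter's settings, add a HADOOP_CONF_DIR property pointing at that cluster's client config (per the earlier reply, all-capitalized property names are exported as environment variables). Interpreter names and paths below are illustrative:

```shell
# Interpreter settings UI, one Spark interpreter per cluster
# (shown in env-var form; names and paths are illustrative):

# spark_cluster_a:
#   HADOOP_CONF_DIR=/etc/hadoop/conf-cluster-a
# spark_cluster_b:
#   HADOOP_CONF_DIR=/etc/hadoop/conf-cluster-b
# spark_cluster_c:
#   HADOOP_CONF_DIR=/etc/hadoop/conf-cluster-c
```

In a notebook, a user then picks a cluster by invoking the matching interpreter, e.g. %spark_cluster_a, assuming the interpreter names above.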

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Serega Sheypak
Hi, thanks for your reply! What do you mean by that? I can have only one env variable HADOOP_CONF_DIR... And how can a user pick which env to run? Or do you mean I have to create three Spark interpreters, each with its own HADOOP_CONF_DIR pointing to a single cluster's config? 2017-06-30

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Jeff Zhang
Try setting HADOOP_CONF_DIR to each cluster's YARN conf in the interpreter setting. Serega Sheypak wrote on Fri, Jun 30, 2017 at 10:11 PM: > Hi, I have several different hadoop clusters, each of them has its own > YARN. > Is it possible to configure a single Zeppelin instance to work with > different clusters? > I want to run spa

Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Serega Sheypak
Hi, I have several different Hadoop clusters, and each of them has its own YARN. Is it possible to configure a single Zeppelin instance to work with different clusters? I want to run Spark on cluster A if the data is there. Right now my Zeppelin runs on a single cluster and it sucks data from remote clusters w