Re: Can not connect to a remote spark master

2018-10-22 Thread Alex Dzhagriev
…ter settings, and if it does not find it, it will look for it in the Zeppelin environment variables; so you can specify it on both sides, but as it does not change frequently it is better in the Zeppelin environment variable. On Sat, Oct 20, 2018 at 0:25, Alex Dzhagriev () wrote: …
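A minimal sketch of the environment-variable route described above: SPARK_HOME is exported in Zeppelin's conf/zeppelin-env.sh so every interpreter process inherits it (the path /opt/spark is an assumption, not from the thread; point it at your actual Spark installation):

```shell
# conf/zeppelin-env.sh -- sourced when the Zeppelin daemon starts.
# /opt/spark is a placeholder; use the directory of your Spark distribution.
export SPARK_HOME=/opt/spark
```

Restart Zeppelin after editing this file so the interpreter processes pick up the new variable.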

Re: spark hive metastore setting is ignored in 0.8.0 docker

2018-10-19 Thread Alex Dzhagriev
I set it as the spark interpreter configuration value. I also have hive-site.xml at the following path: zeppelin/conf/hive-site.xml. Thanks, Alex. On Fri, Oct 19, 2018 at 4:58 PM Jeff Zhang wrote: Usually hive.metastore.uris is set in hive-site.xml, where do you set it? …
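For reference, a minimal hive-site.xml carrying only the metastore URI might look like the sketch below. The hostname is a placeholder, not taken from the thread; 9083 is the conventional Thrift metastore port:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Point the Spark interpreter at a remote Hive metastore instead of a
       local Derby-backed one. "metastore-host" is a hypothetical hostname. -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```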

Re: Can not connect to a remote spark master

2018-10-19 Thread Alex Dzhagriev
Thanks for the quick reply. Should I specify it for the Zeppelin process or the Spark interpreter? Thanks, Alex. On Fri, Oct 19, 2018 at 4:53 PM Jeff Zhang wrote: You need to specify SPARK_HOME, which is where Spark is installed. Alex Dzhagriev wrote on Sat, Oct 20, 2018 at 3:12 AM: …

Can not connect to a remote spark master

2018-10-19 Thread Alex Dzhagriev
Hello, I have a remote Spark cluster and I'm trying to use it by setting the spark interpreter property master to spark://spark-cluster-master:7077; however, I'm getting the following error: java.lang.RuntimeException: SPARK_HOME is not specified in interpreter-setting for non-local mode, if you sp…
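Per the (truncated) error above, a non-local master also requires SPARK_HOME to be set. A sketch of the two spark interpreter properties involved — the master value is from the thread, the SPARK_HOME path is an assumed example:

```
master      spark://spark-cluster-master:7077
SPARK_HOME  /opt/spark    # hypothetical path; must point at a real Spark distribution
```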

spark hive metastore setting is ignored in 0.8.0 docker

2018-10-19 Thread Alex Dzhagriev
Hello, I have set the "hive.metastore.uris" config value in the spark interpreter to use a remote metastore; however, the local metastore is used instead. This worked fine in the previous version 0.7.3, and it works if I uncheck the "zeppelin.spark.useNew" setting. Thanks, Alex.
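As the message describes, unchecking zeppelin.spark.useNew falls back to the old Spark interpreter implementation, which does honor the property. A sketch of the interpreter settings involved (the URI is a placeholder, not from the thread):

```
hive.metastore.uris    thrift://metastore-host:9083   # placeholder remote metastore URI
zeppelin.spark.useNew  false                          # workaround: use the old interpreter implementation
```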