Hmm, I think so. Please file a ticket for it.


On Fri, Dec 9, 2016 at 1:49 PM, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:

> Hi Jeff,
>
> When I made pySpark the default, it works as expected,
> except in the Settings UI. See the screenshot below.
>
> Notice it shows %spark twice:
> the first time as the default, the second time not.
> It should have been %pyspark (default), %spark, ...
> since I made pyspark the default.
>
> Is this a new bug in 0.7?
>
> [image: Inline image 1]
>
>
> --
> Ruslan Dautkhanov
>
> On Wed, Nov 30, 2016 at 7:34 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>
> Hi Ruslan,
>
> I missed another thing: you also need to delete the file conf/interpreter.json,
> which stores the original settings. Otherwise the original settings are always
> loaded.
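>
> On a typical install that amounts to something like this (a minimal sketch;
> paths assume a standard Zeppelin layout, and backing up the file first is a
> good idea in case you need to restore other interpreter settings):
>
>     # stop Zeppelin, remove the cached interpreter settings, restart
>     $ZEPPELIN_HOME/bin/zeppelin-daemon.sh stop
>     cp $ZEPPELIN_HOME/conf/interpreter.json /tmp/interpreter.json.bak
>     rm $ZEPPELIN_HOME/conf/interpreter.json
>     $ZEPPELIN_HOME/bin/zeppelin-daemon.sh start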
>
>
> On Thu, Dec 1, 2016 at 1:03 AM, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:
>
> Got it. Thanks Jeff.
>
> I've downloaded
>
> https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
> and saved it to $ZEPPELIN_HOME/interpreter/spark/
> Then I moved "defaultInterpreter": true,
> from the JSON section with
>     "className": "org.apache.zeppelin.spark.SparkInterpreter",
> to the section with
>     "className": "org.apache.zeppelin.spark.PySparkInterpreter",
>
> pySpark is still not the default.
>
>
>
> --
> Ruslan Dautkhanov
>
> On Tue, Nov 29, 2016 at 10:36 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>
> No, you don't need to create that directory; it should already exist as
> $ZEPPELIN_HOME/interpreter/spark
>
>
>
>
> On Wed, Nov 30, 2016 at 12:12 PM, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:
>
> Thank you Jeff.
>
> Do I have to create the interpreter/spark directory in $ZEPPELIN_HOME/conf
> or in the $ZEPPELIN_HOME directory?
> So is zeppelin.interpreters in zeppelin-site.xml deprecated in 0.7?
>
> Thanks!
>
>
>
> --
> Ruslan Dautkhanov
>
> On Tue, Nov 29, 2016 at 6:54 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>
> The default interpreter is now defined in interpreter-setting.json.
>
> You can update the following file to make pyspark the default
> interpreter and then copy it to the interpreter/spark folder:
>
>
> https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
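>
> For example (assuming a standard layout; adjust paths to your install):
>
>     # after editing interpreter-setting.json to set
>     # "defaultInterpreter": true on the PySparkInterpreter entry:
>     cp interpreter-setting.json $ZEPPELIN_HOME/interpreter/spark/
>     # restart Zeppelin so the new setting is picked up
>     $ZEPPELIN_HOME/bin/zeppelin-daemon.sh restart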
>
>
>
> On Wed, Nov 30, 2016 at 8:49 AM, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:
>
> After the 0.6.2 -> 0.7 upgrade, pySpark isn't the default Spark interpreter,
> even though we have org.apache.zeppelin.spark.*PySparkInterpreter*
> listed first in zeppelin.interpreters.
>
> zeppelin.interpreters in zeppelin-site.xml:
>
> <property>
>   <name>zeppelin.interpreters</name>
>   <value>org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkInterpreter
>   ...
> </property>
>
>
>
> Any ideas how to fix this?
>
>
> Thanks,
> Ruslan
>
>
>
>
>