Since 0.7, the default interpreter is defined in interpreter-setting.json.

You can edit the following file to make pyspark the default
interpreter, then copy it into the interpreter/spark folder:

https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
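
As a rough sketch of what that edit looks like: the file is a JSON array
of interpreter definitions, and in the 0.7-era schema the entry carrying
"defaultInterpreter": true is treated as the default. Field names below
are taken from the linked file but trimmed heavily; check your version's
copy for the exact layout:

  [
    {
      "group": "spark",
      "name": "pyspark",
      "className": "org.apache.zeppelin.spark.PySparkInterpreter",
      "defaultInterpreter": true,
      "properties": {
        ...
      }
    },
    {
      "group": "spark",
      "name": "spark",
      "className": "org.apache.zeppelin.spark.SparkInterpreter",
      "properties": {
        ...
      }
    }
  ]

Note that only one entry should carry "defaultInterpreter": true, so if
the spark entry has it, move the flag to the pyspark entry rather than
duplicating it. After editing, copy the file into interpreter/spark under
your Zeppelin installation and (presumably) restart Zeppelin so the new
setting is picked up.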



On Wed, Nov 30, 2016 at 8:49 AM, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:

> After the 0.6.2 -> 0.7 upgrade, pySpark isn't the default Spark
> interpreter, even though we have org.apache.zeppelin.spark.*PySparkInterpreter*
> listed first in zeppelin.interpreters.
>
> zeppelin.interpreters in zeppelin-site.xml:
>
> <property>
>   <name>zeppelin.interpreters</name>
>
> <value>org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkInterpreter
> ...
> </property>
>
>
>
> Any ideas how to fix this?
>
>
> Thanks,
> Ruslan
>
