Got it. Thanks Jeff.

I've downloaded
https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
and saved it to $ZEPPELIN_HOME/interpreter/spark/.
Then I moved "defaultInterpreter": true
from the JSON section with
    "className": "org.apache.zeppelin.spark.SparkInterpreter",
to the section with
    "className": "org.apache.zeppelin.spark.PySparkInterpreter",

PySpark is still not the default.



-- 
Ruslan Dautkhanov

On Tue, Nov 29, 2016 at 10:36 PM, Jeff Zhang <zjf...@gmail.com> wrote:

> No, you don't need to create that directory; it should be in
> $ZEPPELIN_HOME/interpreter/spark.
>
> On Wed, Nov 30, 2016 at 12:12 PM, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:
>
>> Thank you Jeff.
>>
>> Do I have to create the interpreter/spark directory in $ZEPPELIN_HOME/conf
>> or in $ZEPPELIN_HOME?
>> So is zeppelin.interpreters in zeppelin-site.xml deprecated in 0.7?
>>
>> Thanks!
>>
>>
>>
>> --
>> Ruslan Dautkhanov
>>
>> On Tue, Nov 29, 2016 at 6:54 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>>
>> The default interpreter is now defined in interpreter-setting.json.
>>
>> You can update the following file to make pyspark the default
>> interpreter, then copy it to the interpreter/spark folder:
>>
>> https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
>>
>> On Wed, Nov 30, 2016 at 8:49 AM, Ruslan Dautkhanov <dautkha...@gmail.com> wrote:
>>
>> After the 0.6.2 -> 0.7 upgrade, PySpark isn't the default Spark interpreter,
>> even though we have org.apache.zeppelin.spark.*PySparkInterpreter*
>> listed first in zeppelin.interpreters.
>>
>> zeppelin.interpreters in zeppelin-site.xml:
>>
>> <property>
>>   <name>zeppelin.interpreters</name>
>>   <value>org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkInterpreter
>> ...
>> </property>
>>
>> Any ideas how to fix this?
>>
>>
>> Thanks,
>> Ruslan
>>
>>
>>
