Hi,

I have been using Zeppelin for quite a while without issues.
Lately, I have been trying to configure pyspark, but I can't seem to make
it work.
Running pyspark locally works perfectly, but regardless of the
PYTHONPATH I specify in zeppelin-env.sh, any use of pyspark from Zeppelin results in:

Error from python worker:
  /usr/bin/python: No module named pyspark
PYTHONPATH was:
  /usr/lib/spark/python/lib/pyspark.zip:/usr/lib/spark/python/
lib/py4j-0.9-src.zip:/usr/lib/spark/lib/spark-assembly-1.6.1-hadoop2.6.0.jar


Now, that pyspark.zip is actually not there (my distribution has a tar.gz
inside /usr/lib/spark/lib), but no matter what I set, the PYTHONPATH does
not change.
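For reference, the kind of exports I have been trying in zeppelin-env.sh look roughly like this (the exact paths and archive names here are illustrative, adjusted to my install):

```shell
# Illustrative zeppelin-env.sh exports -- archive names are examples,
# not necessarily what ships in my /usr/lib/spark/lib.
export SPARK_HOME=/usr/lib/spark
export PYSPARK_PYTHON=/usr/bin/python
export PYTHONPATH="${SPARK_HOME}/python:${SPARK_HOME}/python/lib/py4j-0.9-src.zip:${PYTHONPATH}"
```

Even with variations of the above, the PYTHONPATH reported in the error stays the same.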

Any idea?

Regards,
Stefano
