Hi everyone,
Since the Python API is essentially just a front end that needs SPARK_HOME
defined anyway, I think it would be worthwhile to publish the Python part of
Spark on PyPI, so that any Python project depending on PySpark could manage
that dependency via pip.

For now I just symlink python/pyspark into my Python installation's
site-packages/ directory so that PyCharm and other lint tools work properly.
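Concretely, the workaround amounts to something like this (a rough sketch,
assuming a POSIX system, a single site-packages directory, and that
SPARK_HOME points at an unpacked Spark distribution):

    import os
    import site

    # Link the bundled PySpark sources into site-packages so IDEs and
    # lint tools can resolve the imports.
    spark_home = os.environ["SPARK_HOME"]
    site_packages = site.getsitepackages()[0]
    os.symlink(os.path.join(spark_home, "python", "pyspark"),
               os.path.join(site_packages, "pyspark"))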
I'd be happy to do the setup.py work, or anything else that's needed.
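For the packaging itself, I'm imagining a fairly minimal setup.py living
under python/, roughly along these lines (just a sketch to start the
discussion; the name, version, and dependency list are placeholders, not a
final proposal):

    from setuptools import setup, find_packages

    setup(
        name="pyspark",                # placeholder; final name up for discussion
        version="1.4.0",               # would track the Spark release version
        description="Python API for Apache Spark",
        packages=find_packages(),      # picks up pyspark and its subpackages
        install_requires=["py4j"],     # PySpark talks to the JVM through py4j
    )

With that in place, a plain "pip install pyspark" would at least make the
modules importable for development tools, even if running jobs still
requires a full Spark distribution and SPARK_HOME.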

What do you think?

Regards,

Olivier.
