Hey,

It depends on your configuration. I build my Docker image with Spark 2.0
installed, and in PyCharm I configure the interpreter to use Docker and
add the following environment variables to the script's run configuration.
You can check the Dockerfile here: https://github.com/zhangxuan1918/spark2.0

PYSPARK_PYTHON=/usr/bin/python
PYSPARK_DRIVER_PYTHON=/usr/bin/python
PYTHONPATH=/usr/spark/python:/usr/spark/python/lib/py4j-0.10.1-src.zip:$PYTHONPATH
SPARK_HOME=/usr/spark
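For a quick sanity check that PyCharm actually passes these variables to the
script, something like this can go at the top of the file (a minimal sketch;
the values mirror the Dockerfile above, and setdefault only fills in whatever
the run configuration left unset):

```python
import os

# Fallback values matching the Dockerfile linked above; setdefault keeps
# anything already supplied by PyCharm's run configuration.
defaults = {
    "PYSPARK_PYTHON": "/usr/bin/python",
    "PYSPARK_DRIVER_PYTHON": "/usr/bin/python",
    "SPARK_HOME": "/usr/spark",
    "PYTHONPATH": "/usr/spark/python:/usr/spark/python/lib/py4j-0.10.1-src.zip",
}
for name, value in defaults.items():
    os.environ.setdefault(name, value)
    print(name, "=", os.environ[name])
```

If any line prints an unexpected value, the run configuration is the first
place to look.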



On 3 Mar 2017, at 16:11, Sidney Feiner <sidney.fei...@startapp.com> wrote:

Hey,
I once found an article about that:
https://mengdong.github.io/2016/08/08/fully-armed-pyspark-with-ipython-and-jupyter/

And I once managed to set it up in PyCharm as well. What I had to do was
add /path/to/spark to a system variable called "PYTHONPATH".
Try that one, it might help :)
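If editing system variables is awkward, the same effect can be sketched at
the top of the script itself (the /path/to/spark below is a stand-in; use
your actual Spark install directory):

```python
import os
import sys

# Hypothetical install location; replace /path/to/spark with the same
# directory you would otherwise put in PYTHONPATH.
spark_home = os.environ.get("SPARK_HOME", "/path/to/spark")

# Equivalent of adding /path/to/spark/python to the PYTHONPATH variable:
# it lets `import pyspark` resolve without any IDE settings.
sys.path.insert(0, os.path.join(spark_home, "python"))
```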

*From:* Anahita Talebi [mailto:anahita.t.am...@gmail.com]
*Sent:* Friday, March 3, 2017 5:05 PM
*To:* Pushkar.Gujar <pushkarvgu...@gmail.com>
*Cc:* User <user@spark.apache.org>
*Subject:* Re: How to run a spark on Pycharm


Hi,

Thanks for your answer.

Sorry, I am completely beginner in running the code in spark.
Could you please tell me a bit more in details how to do that?

I installed IPython and Jupyter notebook on my local machine. But how can I
run the code using them? Earlier, I tried to run the code with PyCharm but
failed.
Thanks,
Anahita

On Fri, Mar 3, 2017 at 3:48 PM, Pushkar.Gujar <pushkarvgu...@gmail.com>
wrote:

Jupyter notebook/IPython can be connected to Apache Spark
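One common way to wire them together (my own sketch, not something spelled
out in this thread) is to point PySpark's driver at Jupyter before launching
`pyspark`; the two variables below are read by the pyspark launcher script:

```python
import os

# Tell bin/pyspark to start the Python driver inside a Jupyter notebook
# server instead of the plain interactive shell.
os.environ["PYSPARK_DRIVER_PYTHON"] = "jupyter"
os.environ["PYSPARK_DRIVER_PYTHON_OPTS"] = "notebook"
print(os.environ["PYSPARK_DRIVER_PYTHON"])
```

Exporting the same two variables in a shell and then running `pyspark` opens
a notebook, typically with the SparkContext `sc` already defined.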


Thank you,
*Pushkar Gujar*


On Fri, Mar 3, 2017 at 9:43 AM, Anahita Talebi <anahita.t.am...@gmail.com>
wrote:

Hi everyone,

I am trying to run Spark code in PyCharm. I tried to give the path of
Spark as an environment variable in PyCharm's run configuration.
Unfortunately, I get an error. Does anyone know how I can run Spark
code in PyCharm?
It doesn't necessarily have to be PyCharm; if you know of any other
software, it would be nice to tell me.
Thanks a lot,
Anahita
