Thank you so much.

I found the issue. My fault: the stock ipython version 0.12.1 is too old and
does not honor PYTHONSTARTUP. Upgrading ipython solved the problem.
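
For anyone hitting the same thing, a quick way to check which IPython the
driver is using (a minimal sketch; the exact release that first honors
PYTHONSTARTUP is an assumption on my part, not checked against IPython's
release notes):

    # run this with the same interpreter that PYSPARK_DRIVER_PYTHON points at
    import IPython
    print(IPython.__version__)   # 0.12.x ignored PYTHONSTARTUP here; newer releases pick it up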

On Mon, Jul 27, 2015 at 12:43 PM, <felixcheun...@hotmail.com> wrote:

>  Hmm, it should work when you run `PYSPARK_DRIVER_PYTHON="ipython"
> spark/bin/pyspark`
>
> PYTHONSTARTUP is a PYTHON environment variable
>
> https://docs.python.org/2/using/cmdline.html#envvar-PYTHONSTARTUP
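>
> A quick way to see what it does, independent of Spark (a minimal sketch; the
> startup file path here is made up):
>
>     # /tmp/startup.py -- hypothetical startup file
>     print("ran PYTHONSTARTUP file")
>
>     # With `export PYTHONSTARTUP=/tmp/startup.py`, an interactive `python`
>     # session prints the message before the first prompt. A non-interactive
>     # run (`python some_script.py`) does not: the interpreter itself reads
>     # the variable, and only for interactive sessions, which is also why it
>     # never appears anywhere else in Spark's source.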
>
>
>
> On Sun, Jul 26, 2015 at 4:06 PM -0700, "Zerony Zhao" <bw.li...@gmail.com>
> wrote:
>
>     Hello everyone,
>
>  I have a newbie question.
>
>  $SPARK_HOME/bin/pyspark will create SparkContext automatically.
>
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /__ / .__/\_,_/_/ /_/\_\   version 1.4.1
>       /_/
>
> Using Python version 2.7.3 (default, Jun 22 2015 19:33:41)
> SparkContext available as sc, HiveContext available as sqlContext.
>
>
>  But when using ipython as the driver,
>
> PYSPARK_DRIVER_PYTHON="ipython" spark/bin/pyspark
>
>  the SparkContext is not created automatically, and I have to execute
>
> execfile('spark_home/python/pyspark/shell.py')
>
>  Is this by design?
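>
>  (For context, a rough sketch of what python/pyspark/shell.py sets up;
> simplified, not the actual file contents:)
>
>     from pyspark import SparkContext
>     from pyspark.sql import HiveContext
>
>     sc = SparkContext(appName="PySparkShell")   # the `sc` from the banner above
>     sqlContext = HiveContext(sc)                # the `sqlContext` from the banner above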
>
>  I read the bash script bin/pyspark and noticed the line:
>
> export PYTHONSTARTUP="$SPARK_HOME/python/pyspark/shell.py"
>
>  But I searched the whole Spark source code and the variable PYTHONSTARTUP
> is never referenced anywhere else, so I could not figure out when PYTHONSTARTUP gets executed.
>
>  Thank you.
>
