Weird error using absolute path to run pyspark when using ipython driver

2015-07-27 Thread Zerony Zhao
Hello everyone, another newbie question. PYSPARK_DRIVER_PYTHON=ipython ./bin/pyspark (run from $SPARK_HOME) works fine: Python 2.7.10 (default, Jul 3 2015, 01:26:20) Type "copyright", "credits" or "license" for more information. IPython 3.2.1 -- An enhanced Interactive Python. ? -> Introducti...
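The preview cuts off before the error itself, but a quick way to see whether the IPython driver picked up the intended Spark installation is to inspect the environment from inside the shell. This is a diagnostic sketch, not part of the original thread; it only uses the standard library plus an optional pyspark import.

    # Diagnostic sketch (not from the thread): run inside the IPython driver
    # to confirm which Spark installation and Python path it is actually using.
    import os
    import sys

    print("SPARK_HOME    :", os.environ.get("SPARK_HOME"))
    print("PYTHONSTARTUP :", os.environ.get("PYTHONSTARTUP"))
    print("sys.path head :", sys.path[:3])

    try:
        import pyspark
        print("pyspark found at:", pyspark.__file__)
    except ImportError as exc:
        # If this fails, the driver was started without Spark's python/
        # directory on sys.path, which would fit the absolute-path symptom.
        print("pyspark not importable:", exc)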

Re: PYSPARK_DRIVER_PYTHON="ipython" spark/bin/pyspark Does not create SparkContext

2015-07-27 Thread Zerony Zhao
[Quoted reply; preview truncated.] The reply points to the Python documentation for the PYTHONSTARTUP environment variable: https://docs.python.org/2/using/cmdline.html#envvar-PYTHONSTARTUP ... On Sun, Jul 26,
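The linked page describes the mechanism the thread hinges on: a plain interactive python executes the file named by PYTHONSTARTUP before the first prompt, and bin/pyspark relies on that to define sc in the session. A minimal sketch of the idea follows; the file name and its contents are illustrative stand-ins, not Spark's actual shell.py.

    # startup.py -- illustrative stand-in for the file that bin/pyspark points
    # PYTHONSTARTUP at; a plain interactive `python` runs this before the first
    # prompt, so the names defined here are available in the session.
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("pyspark-shell")
    sc = SparkContext(conf=conf)
    print("SparkContext available as sc (master=%s)" % sc.master)

Launched as PYTHONSTARTUP=startup.py python, a stock CPython shell would define sc; whether a given IPython version honours PYTHONSTARTUP in the same way is exactly what the thread is circling around.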

PYSPARK_DRIVER_PYTHON="ipython" spark/bin/pyspark Does not create SparkContext

2015-07-26 Thread Zerony Zhao
Hello everyone, I have a newbie question. $SPARK_HOME/bin/pyspark will create SparkContext automatically. Welcome to [Spark ASCII-art banner] version 1.4.1. Using Python version 2.7.3 (default, Ju...
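If the context is not created automatically when IPython is the driver (the symptom in the subject line), it can be built by hand inside the session. A minimal fallback sketch, assuming Spark's python/ directory is importable and local mode is acceptable for testing:

    # Manual fallback sketch: create the context yourself when the shell
    # startup script did not run (e.g. the driver ignored PYTHONSTARTUP).
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setMaster("local[*]")      # assumption: local mode for testing
            .setAppName("manual-shell"))
    sc = SparkContext(conf=conf)

    # Quick smoke test that the context works.
    print(sc.parallelize(range(10)).sum())   # expected: 45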