Hi,
You can try

PYSPARK_DRIVER_PYTHON=/path/to/ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook" /path/to/pyspark
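
For example, assuming Anaconda is installed under /opt/local/anaconda as in your mail (the bin/ layout below is an assumption, adjust to your install), you can point both the driver and the executors at it:

# interpreter used by the executor processes (must exist on all worker nodes)
export PYSPARK_PYTHON=/opt/local/anaconda/bin/python
# driver runs IPython with the notebook server
PYSPARK_DRIVER_PYTHON=/opt/local/anaconda/bin/ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook" /path/to/pyspark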


/Tomas

> On 11 May 2015, at 22:17, Bin Wang <binwang...@gmail.com> wrote:
> 
> Hey there, 
> 
> I have installed a Python interpreter in a certain location, say 
> "/opt/local/anaconda". 
> 
> Is there any way that I can specify the Python interpreter while developing 
> in the IPython notebook? Maybe a property to set while creating the SparkContext?
> 
> 
> I know that I can put "#!/opt/local/anaconda" at the top of my Python code 
> and use spark-submit to distribute it to the cluster. However, since I am 
> using the IPython notebook, that option is not available. 
> 
> Best,  
> 
> Bin 
