Hi Stuti,
2016-03-15 10:08 GMT+01:00 Stuti Awasthi:
> Thanks Prabhu,
>
> I tried starting in local mode, but it still picks Python 2.6 only. I have
> exported “DEFAULT_PYTHON” as a session variable and also included it in PATH.
>
> Export:
>
> export DEFAULT_PYTHON="/home/stuti/Python/bin/python2.7"
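
Note that Spark itself does not consult DEFAULT_PYTHON: the PySpark shell picks
its interpreter from PYSPARK_PYTHON (and, for the driver, PYSPARK_DRIVER_PYTHON).
A minimal sketch, reusing the interpreter path from the export above:

    # Point both the workers and the driver at the user-local Python 2.7
    export PYSPARK_PYTHON=/home/stuti/Python/bin/python2.7
    export PYSPARK_DRIVER_PYTHON=/home/stuti/Python/bin/python2.7

    # Launch in local mode; the shell banner reports the Python version in use
    pyspark --master "local[*]"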
Subject: Re: Launch Spark shell using different python version
Hi Stuti,
You can use local mode, but not Spark master or YARN mode, if Python 2.7 is
not installed on all the Spark Worker / NodeManager machines. To run in
master mode (a sketch follows the list):
1. Check whether your user is able to access python2.7 on every machine.
2. Check that python2.7 is installed on all NodeManager machines.
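
A sketch of both checks plus the wiring, assuming python2.7 sits at the same
user-local path on every node (the host names below are placeholders):

    # 1. Confirm each node can run the user-local interpreter
    for host in node1 node2 node3; do
        ssh "$host" /home/stuti/Python/bin/python2.7 --version
    done

    # 2. Once it is present everywhere, point Spark at it, either in
    #    conf/spark-env.sh on every node:
    export PYSPARK_PYTHON=/home/stuti/Python/bin/python2.7

    # ...or per job via the executor environment, e.g. on YARN:
    pyspark --master yarn \
      --conf spark.executorEnv.PYSPARK_PYTHON=/home/stuti/Python/bin/python2.7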
Hi All,
I have a CentOS cluster (without any sudo permissions) which ships with
Python 2.6 by default. I have installed Python 2.7 under my user account and
changed my bashrc so that Python 2.7 is picked up by default. Then I set the
following properties in bashrc in order to launch Spark:
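
For a user-local install like this, such bashrc properties are usually of
roughly the following shape; a sketch only, where the SPARK_HOME location is
hypothetical rather than taken from the thread:

    # Put the user-local Python 2.7 ahead of the system Python 2.6 on PATH
    export PATH=/home/stuti/Python/bin:$PATH

    # Tell PySpark explicitly which interpreter to use
    export PYSPARK_PYTHON=/home/stuti/Python/bin/python2.7

    # Hypothetical Spark install location
    export SPARK_HOME=/home/stuti/spark
    export PATH=$SPARK_HOME/bin:$PATH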