Perfect. Thanks Josh. I've added myself as a watcher on the ticket.
(By the way, when I upgraded to 2.7 I replaced 2.6 so the executable name
didn't change.)
On Fri, Feb 28, 2014 at 12:12 AM, Josh Rosen wrote:
There's an open ticket to update the Python version:
https://spark-project.atlassian.net/browse/SPARK-922. In that ticket, I
included instructions for a workaround to manually update a cluster to
Python 2.7.
Did you set the PYSPARK_PYTHON environment variable to the name of your new
Python executable?
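For reference, setting the variable before launching PySpark might look like the sketch below; the executable name python2.7 is an assumption about how the interpreter is installed on your nodes:

```shell
# Point PySpark at the upgraded interpreter. The pyspark launch
# script reads PYSPARK_PYTHON and uses it for the workers as well,
# so it must resolve to the same interpreter on every node.
export PYSPARK_PYTHON=python2.7   # assumed executable name
echo "PySpark will use: $PYSPARK_PYTHON"
```

This needs to be exported in the environment of the shell (or init script) that starts the Spark processes, not just on the machine you submit from.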
Makes sense. I'll give it a shot and check back here if that doesn't work.
Are there plans to upgrade the EC2 deployment scripts and/or AMI to have
Python 2.7 by default? If so, is there a ticket somewhere I can follow?
Nick
On Thu, Feb 27, 2014 at 6:50 PM, Bryn Keller wrote:
Hi Nick,
All the nodes of the cluster need to have the same Python setup (path and
version). So if, for example, you run 2.7.5 on the master and it ships code
to worker nodes running 2.6.x, you'll get invalid opcode errors.
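One way to catch this before submitting a job is to collect the version each node reports and flag any mismatch. A small illustration — the version strings here are hard-coded placeholders standing in for the output you'd gather (e.g. via ssh) from each node:

```shell
# Placeholder for the per-node output of: python -V
versions="2.7.5
2.7.5
2.6.8"

# Count the distinct versions; more than one means a mismatched cluster.
distinct=$(printf '%s\n' "$versions" | sort -u | wc -l)
if [ "$distinct" -ne 1 ]; then
  echo "Python version mismatch across nodes"
else
  echo "Cluster Python versions are consistent"
fi
```

With the placeholder data above this reports a mismatch, since one node is still on 2.6.x.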
Thanks,
Bryn
On Thu, Feb 27, 2014 at 3:48 PM, nicholas.chammas <
nichol