It would be great if we supported Python 3, and I'd be happy to review any
pull requests to add it.  I don't know how widely used Python 3 is yet,
but I'm open to supporting it if it won't require too much work.

By the way, we recently added support for PyPy:
https://github.com/apache/spark/pull/2144

- Josh



On Fri, Oct 3, 2014 at 6:44 PM, tomo cocoa <cocoatom...@gmail.com> wrote:

> Hi,
>
> I would also like PySpark to run on Python 3.
>
> Do you have a specific reason or need to use PySpark with Python 3?
> If you create an issue on JIRA, I will try to resolve it.
>
>
> On 4 October 2014 06:47, Gen <gen.tan...@gmail.com> wrote:
>
>> According to the official Spark site, the latest release of Spark (1.1.0)
>> does not work with Python 3:
>>
>> Spark 1.1.0 works with Python 2.6 or higher (but not Python 3). It uses
>> the standard CPython interpreter, so C libraries like NumPy can be used.
>>
>
>
> --
> class Cocoatomo:
>     name = 'cocoatomo'
>     email_address = 'cocoatom...@gmail.com'
>     twitter_id = '@cocoatomo'
>
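
As a small illustration of the line quoted from the docs above (PySpark uses
the standard CPython interpreter, so C libraries like NumPy work inside worker
functions), here is a minimal Python 2 sketch; the app name and sample data
are made up:

import numpy as np
from pyspark import SparkContext

# Runs on the standard CPython 2.x interpreter.
sc = SparkContext(appName="numpy-example")

# Each record is a plain Python list; NumPy does the per-record math on the workers.
rdd = sc.parallelize([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
norms = rdd.map(lambda row: float(np.linalg.norm(np.array(row)))).collect()

print(norms)  # e.g. [3.74..., 8.77...]
sc.stop()

Running it with bin/spark-submit under Python 2.6+ matches what the docs
describe; the same script would not run on Python 3 until support is added.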
