It would be great if we supported Python 3 and I'd be happy to review any
pull requests to add it. I don't know that Python 3 is very widely-used,
but I'm open to supporting it if it won't require too much work.
By the way, we recently added support for PyPy:
https://github.com/apache/spark/pull/
Hi,
I would also like PySpark to be able to run on Python 3.
Do you have a specific reason or use case for running PySpark on Python 3?
If you create an issue on JIRA, I will try to resolve it.
On 4 October 2014 06:47, Gen wrote:
> According to the official Spark site, the latest version of Spark
> (1.1.0) does not work with Python 3.
According to the official Spark site, the latest version of Spark
(1.1.0) does not work with Python 3.
Spark 1.1.0 works with Python 2.6 or higher (but not Python 3). It uses the
standard CPython interpreter, so C libraries like NumPy can be used.
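
For reference, a minimal sketch of a PySpark job that calls into NumPy inside
a transformation, assuming a local Spark install with pyspark on the Python
path (the PYSPARK_PYTHON environment variable selects which Python 2
interpreter the workers use, e.g. export PYSPARK_PYTHON=python2.7):

    import numpy as np
    from pyspark import SparkContext

    sc = SparkContext("local", "numpy-example")

    # Each partition computes a NumPy mean over its values; this works
    # because PySpark workers run the standard CPython interpreter, so
    # C extensions such as NumPy load normally.
    data = sc.parallelize(range(100), 4)
    means = data.mapPartitions(lambda it: [float(np.mean(list(it)))]).collect()
    print(means)

    sc.stop()

This is only an illustration of mixing NumPy with PySpark on Python 2; it is
not tied to any particular Python 3 porting work discussed above.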