We already test CPython 2.6, CPython 3.4, and PyPy 2.5; the full run takes
more than 30 minutes (without parallelization). I think that should be
enough. PyPy 2.2 is too old, and we don't have the resources to support it.
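For what it's worth, the wall-clock time can be cut down with the script's
parallelism option; a sketch, assuming the flag reported by
./python/run-tests --help:

$ ./python/run-tests --parallelism=4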
On Fri, Nov 6, 2015 at 2:27 AM, Chang Ya-Hsuan wrote:
Hi, I ran ./python/run-tests to test the following modules of spark-1.5.1:
['pyspark-core', 'pyspark-ml', 'pyspark-mllib', 'pyspark-sql',
'pyspark-streaming']
against the following PyPy versions:
pypy-2.2.1 pypy-2.3 pypy-2.3.1 pypy-2.4.0 pypy-2.5.0 pypy-2.5.1
pypy-2.6.0 pypy-2.6.1 pypy-4.0.0
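For reference, the invocation looks roughly like this (paths are
placeholders; the exact flag names are whatever ./python/run-tests --help
reports):

$ ./python/run-tests \
    --python-executables=/path/to/pypy-2.6.1/bin/pypy \
    --modules=pyspark-core,pyspark-sql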
You could try running PySpark's own unit tests. Try ./python/run-tests
--help for instructions.
On Thu, Nov 5, 2015 at 12:31 AM Chang Ya-Hsuan wrote:
I've tested the following PyPy versions against spark-1.5.1:
pypy-2.2.1
pypy-2.3
pypy-2.3.1
pypy-2.4.0
pypy-2.5.0
pypy-2.5.1
pypy-2.6.0
pypy-2.6.1
I ran
$ PYSPARK_PYTHON=/path/to/pypy-xx.xx/bin/pypy \
    /path/to/spark-1.5.1/bin/pyspark
and only pypy-2.2.1 failed.
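For the versions that do start, a quick sanity check in the shell looks
like this (sc is predefined by the pyspark shell; the expected values are
plain arithmetic):

>>> sc.parallelize(range(100)).map(lambda x: x * x).sum()  # expect 328350
>>> sc.parallelize([1, 2, 3, 4]).reduce(lambda a, b: a + b)  # expect 10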
Any suggestions?
Thanks for your quick reply.
I will test several pypy versions and report the result later.
On Thu, Nov 5, 2015 at 4:06 PM, Josh Rosen wrote:
I noticed that you're using PyPy 2.2.1, but it looks like Spark 1.5.1's
docs say that we only support PyPy 2.3+. Could you try using a newer PyPy
version to see if that works?
I just checked, and it looks like our Jenkins tests are running against PyPy
2.5.1, so that version is known to work. I'm not sure about other versions,
though.
Hi all,
I am trying to run PySpark with PyPy; it works with spark-1.3.1
but fails with spark-1.4.1 and spark-1.5.1.
my pypy version:
$ /usr/bin/pypy --version
Python 2.7.3 (2.2.1+dfsg-1ubuntu0.3, Sep 30 2015, 15:18:40)
[PyPy 2.2.1 with GCC 4.8.4]
works with spark-1.3.1:
$ PYSPARK_PYTHON=/usr/bin/pypy /path/to/spark-1.3.1/bin/pyspark
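The same invocation against the newer releases is where it fails:

$ PYSPARK_PYTHON=/usr/bin/pypy /path/to/spark-1.4.1/bin/pyspark
$ PYSPARK_PYTHON=/usr/bin/pypy /path/to/spark-1.5.1/bin/pyspark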