Hi,
I haven't spent a lot of time working on the Python side of Spark before, so
apologies if this is a basic question, but I'm trying to figure out the
best way to run a small subset of Python tests in a tight loop while
developing. The closer I can get to sbt's "~test-only *FooSuite -- -z
test-b
Actually, Python 3.7 has been released too (
https://www.python.org/downloads/release/python-370/), and I fixed the
compatibility issues accordingly:
https://github.com/apache/spark/pull/21714
There has also been an issue for 3.6 (compared to lower versions of Python,
including 3.5): https://github.com/apache
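For anyone curious what a 3.7 compatibility break can look like: one common class of breakage is PEP 479, which became mandatory in 3.7. Whether this is the exact issue the PR above fixed is an assumption on my part; it's shown here only as an illustration.

```python
# PEP 479 (mandatory as of Python 3.7): a StopIteration that escapes a
# generator body is converted into a RuntimeError instead of silently
# ending the generator. Code that relied on the old behaviour breaks.
def first(it):
    return next(it)  # raises StopIteration when `it` is empty

def drain(items):
    while True:
        yield first(iter(items))  # StopIteration here -> RuntimeError in 3.7
        items = items[1:]

try:
    list(drain([]))
    outcome = "ended silently"   # pre-3.7 behaviour
except RuntimeError:
    outcome = "RuntimeError"     # 3.7 behaviour

print(outcome)
```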
SGTM too
On Sun, Aug 12, 2018 at 7:41 AM, shane knapp wrote:
> they do seem like real failures on branches 2.0 and 2.1.
>
> regarding infrastructure, centos and ubuntu have lintr pinned to
> 1.0.1.9000, and installed via:
> devtools::install_github('jimhester/lintr@5431140')
>
> builds on branches 2.2+
There's an informal way to run specific tests. For instance:
SPARK_TESTING=1 ../bin/pyspark pyspark.sql.tests VectorizedUDFTests
I have a partial fix for our testing script to support this in my local
checkout, but I haven't had enough time to make a PR for it yet.
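As a rough sketch of what such a script would do under the hood: selecting a single test method with plain unittest looks like this (the class and method names below are made up for illustration; the real fix would map the command-line arguments above onto a suite like this):

```python
# Minimal sketch with hypothetical test names: building a unittest suite
# that runs exactly one method instead of the whole module.
import unittest

class FooTests(unittest.TestCase):
    def test_a(self):
        self.assertEqual(1 + 1, 2)

    def test_b(self):
        self.assertTrue("spark".startswith("s"))

# Select only test_a and run it; the runner reports one test executed.
suite = unittest.TestSuite([FooTests("test_a")])
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.testsRun, result.wasSuccessful())  # prints "1 True"
```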
On Mon, Aug 20, 2018 at 11:08 AM, Imran Rash