Does anybody here care about us dropping support for Python 2.6 in Spark
2.0?

Python 2.6 is ancient, and is noticeably slower than Python 2.7 in many
areas (e.g. json parsing). Some libraries that Spark depends on have
stopped supporting 2.6. We could still try to convince those library
maintainers to keep supporting 2.6, but it would be extra work. I'm
curious whether anybody still uses Python 2.6 to run Spark.
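For anyone who wants to check the json parsing difference on their own
interpreters, here is a rough micro-benchmark sketch using only the stdlib
(the payload and iteration count are arbitrary choices for illustration);
run it under each Python version and compare the timings:

```python
import json
import timeit

# Build a small JSON payload to parse repeatedly.
payload = json.dumps({"values": list(range(1000))})

# Time 1000 parses of the payload with the stdlib json module.
elapsed = timeit.timeit(lambda: json.loads(payload), number=1000)
print("parsed payload 1000 times in %.3fs" % elapsed)
```

Running the same script under 2.6 and 2.7 gives a like-for-like comparison,
since both ship the json module in the standard library.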

Thanks.
