Please vote on releasing the following candidate as Apache Spark version
3.0.0-preview.
The vote is open until November 3 PST and passes if a majority +1 PMC votes
are cast, with
a minimum of 3 +1 votes.
[ ] +1 Release this package as Apache Spark 3.0.0-preview
[ ] -1 Do not release this package
The Ganglia module has only 2 files. Besides dropping it, we may choose one
of the following two ways to keep supporting it partially, like `kafka-0.8`,
which Apache Spark supports in Scala 2.11 only.
1. We can stick to `dropwizard 3.x` for JDK 8 (by default) and use
`dropwizard 4.x` for the `hadoop-3.2` profile ...
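(To make option 1 concrete: the idea is to pin the metrics dependency per
build profile. Spark's actual build is Maven, so the sbt-style Scala sketch
below is illustrative only; the `hadoop.profile` flag name and the version
numbers are assumptions, not the real build change.)

    // Sketch: choose the Dropwizard Metrics version per profile.
    // "hadoop.profile" and both versions are illustrative assumptions.
    val dropwizardVersion =
      if (sys.props.get("hadoop.profile").contains("3.2")) "4.1.1" // newer line for JDK 11
      else "3.2.6"                                                 // keep 3.x for the JDK 8 default
    libraryDependencies +=
      "io.dropwizard.metrics" % "metrics-core" % dropwizardVersion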
I wanted to raise this to dev@.
Updating dropwizard metrics from 3.2.x to 4.x might be important for
JDK 11 support. Our tests pass as-is without this update, but we don't test
some elements of this metrics support, like Ganglia integration. And I have
heard reports that downstream custom usage ...
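(For readers less familiar with the dependency being discussed: Dropwizard
Metrics is the library behind Spark's MetricsSystem, and its core API lives
in the `com.codahale.metrics` package in both the 3.x and 4.x lines. A
minimal, self-contained Scala sketch of that API, not Spark code:)

    import java.util.concurrent.TimeUnit
    import com.codahale.metrics.{ConsoleReporter, MetricRegistry}

    object MetricsSketch {
      def main(args: Array[String]): Unit = {
        // A registry holds named metrics; sinks such as the Ganglia
        // reporter periodically publish its contents.
        val registry = new MetricRegistry()
        registry.counter("requests").inc()

        val reporter = ConsoleReporter.forRegistry(registry)
          .convertRatesTo(TimeUnit.SECONDS)
          .build()
        reporter.report() // one-shot report to stdout
      }
    }

As far as the core API goes, this compiles unchanged against metrics-core
3.2.x and 4.x; the open question is more the reporter modules, like the
Ganglia one.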
I was trying to avoid changing the version names and reverting the changes on
master again. But you are right that it might lead to confusion about which
release script was used for RC2; I'll follow your advice and create a new RC2
tag.
Thanks!
Xingbo
On Wed, Oct 30, 2019 at 5:06 PM Dongjoon Hyun
wrote:
> Hi, Xingbo. ...
Hi, Xingbo.
Currently, the RC2 tag is pointing at the RC1 tag.
https://github.com/apache/spark/tree/v3.0.0-preview-rc2
Could you cut it from the HEAD of the master branch?
Otherwise, nobody knows which release script you used for RC2.
Bests,
Dongjoon.
On Wed, Oct 30, 2019 at 4:15 PM Xingbo Jiang wrote:
> Hi all, ...
Hi all,
This RC fails because it fails to generate a PySpark release.
I'll start RC2 soon.
Thanks!
Xingbo
On Wed, Oct 30, 2019 at 4:10 PM Xingbo Jiang wrote:
> Thanks Sean, since we need to generate the PySpark release with a different
> name, I would prefer to fail RC1 and start another release candidate.
I agree that we need a PySpark release for this preview release. If
it's a matter of producing it from the same tag, we can evaluate it
within this same release candidate. Otherwise, just roll another
release candidate.
I was able to build it and pass all tests with JDK 8 and JDK 11
(`hadoop-3.2` profile) ...
Thanks Sean, since we need to generate the PySpark release with a different
name, I would prefer to fail RC1 and start another release candidate.
On Wed, Oct 30, 2019 at 4:00 PM Sean Owen wrote:
> I agree that we need a PySpark release for this preview release. If
> it's a matter of producing it from the same tag, we can evaluate it ...
I don't agree with this take. The bottleneck is pretty much not Spark
-- it is all of its dependencies, and there are unfortunately a lot of them.
For example, Chill (among other things) doesn't support 2.13 yet. I
don't think 2.13 is that 'mainstream' yet. We are not close to Scala
2.13 support, so it won't ...
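(A short illustration of why a single dependency gates the whole cross-build:
Scala dependencies are resolved against the Scala binary version, so every
such library must publish a `_2.13` artifact before Spark can build against
2.13. The sbt-style snippet and the chill version below are illustrative
assumptions, not Spark's actual build:)

    // %% expands to chill_2.12 or chill_2.13 depending on scalaVersion;
    // if no chill_2.13 artifact is published, resolution fails under 2.13.
    crossScalaVersions := Seq("2.12.10", "2.13.1")
    libraryDependencies += "com.twitter" %% "chill" % "0.9.3"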
Scala 2.13 support is tracked by
https://issues.apache.org/jira/browse/SPARK-25075. At the current time
there are still major issues remaining, so we don't include Scala 2.13
support in the 3.0.0-preview release.
If the task is finished before the code freeze of Spark 3.0.0, then it's
still possible ...
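(For context on why the remaining work is nontrivial, here is a small,
self-contained example of the kind of source incompatibility the 2.13
collections redesign introduces; this is the well-known general case, not a
specific SPARK-25075 item:)

    object Scala213Sketch {
      // Under 2.12, scala.Seq is scala.collection.Seq, so an Array
      // satisfies it via the implicit array wrappers. Under 2.13,
      // scala.Seq means scala.collection.immutable.Seq, and the next
      // line no longer compiles:
      // val xs: Seq[Int] = Array(1, 2, 3)

      // Cross-buildable spelling that compiles on both 2.12 and 2.13:
      val xs: Seq[Int] = Array(1, 2, 3).toSeq
    }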
Why not try the current Scala (2.13)? Spark has always been one (sometimes
two) Scala versions behind the rest of the Scala ecosystem, and that has
always been a big pain point for everybody. I understand that in the past you
could not switch because of compatibility issues, but 3.x is a major
version ...
Sure, that shouldn't be too hard, but we've historically given very little
support to it.
On Wed, Oct 30, 2019 at 2:31 PM Maciej Szymkiewicz
wrote:
> Could we upgrade to PyPy3.6 v7.2.0?
> On 10/30/19 9:45 PM, Shane Knapp wrote:
>
> one quick thing: we currently test against python2.7, 3.6 *and* ...
Could we upgrade to PyPy3.6 v7.2.0?
On 10/30/19 9:45 PM, Shane Knapp wrote:
> one quick thing: we currently test against python2.7, 3.6 *and*
> pypy2.5.1 (python2.7).
>
> what are our plans for pypy?
>
>
> On Wed, Oct 30, 2019 at 12:26 PM Dongjoon Hyun
> <dongjoon.h...@gmail.com> wrote:
>
also, here's my PR for dropping 2.7 tests:
https://github.com/apache/spark/pull/26330
On Wed, Oct 30, 2019 at 1:45 PM Shane Knapp wrote:
> one quick thing: we currently test against python2.7, 3.6 *and* pypy2.5.1
> (python2.7).
>
> what are our plans for pypy?
>
>
> On Wed, Oct 30, 2019 at 12:26 PM Dongjoon Hyun wrote: ...
one quick thing: we currently test against python2.7, 3.6 *and* pypy2.5.1
(python2.7).
what are our plans for pypy?
On Wed, Oct 30, 2019 at 12:26 PM Dongjoon Hyun
wrote:
> Thank you all. I made a PR for that.
>
> https://github.com/apache/spark/pull/26326
>
> On Tue, Oct 29, 2019 at 5:45 AM Takeshi Yamamuro wrote: ...
Thank you all. I made a PR for that.
https://github.com/apache/spark/pull/26326
On Tue, Oct 29, 2019 at 5:45 AM Takeshi Yamamuro
wrote:
> +1, too.
>
> On Tue, Oct 29, 2019 at 4:16 PM Holden Karau wrote:
>
>> +1 to deprecating but not yet removing support for 3.6
>>
>> On Tue, Oct 29, 2019 at 3 ...