The Apache Software Foundation requires voting before any release can be
published.
On Tue, Mar 31, 2020 at 11:27 PM, Stephen Coy <s...@infomedia.com.au.invalid> wrote:
On 1 Apr 2020, at 5:20 pm, Sean Owen <sro...@gmail.com> wrote:
> It can be published as "3.0.0-rc1" but how do we test that to vote on it
> without some other RC1?
I’m not sure what you mean by this question?
You just mvn -DskipTests install the source release. That is the primary
artifact we're testing. But yes you could put the jars in your local repo
too.
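In concrete terms, that amounts to something like the following (the tarball and directory names here are assumptions based on the 3.0.0 naming, not taken from the vote email):

```shell
# Unpack the RC source release and install its artifacts into the
# local ~/.m2 repository; file names are illustrative.
tar xzf spark-3.0.0.tgz
cd spark-3.0.0
./build/mvn -DskipTests clean install
```

After the install, a downstream pom can declare dependencies on version 3.0.0 and they will resolve from the local repository.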
I think this is pretty standard practice. Obviously the RC can't be
published as "3.0.0". It can be published as "3.0.0-rc1" but how do we test
that to vote on it without some other RC1?
Therefore, if I want to build my product against these jars I need to either
locally install these jars or checkout and build the RC tag.
I guess I need to build anyway because I need a
spark-hadoop-cloud_2.12-3.0.0.jar. BTW, it would be incredibly handy to have
this in the distro, or at least
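If it helps, the hadoop-cloud module is gated behind a Maven profile in the Spark build, so a local build along these lines should produce that jar (the profile names are my reading of the Spark build docs; verify against the RC's pom):

```shell
# Run from the unpacked source release. -Phadoop-cloud enables the
# spark-hadoop-cloud module, which the default build skips;
# -Phadoop-3.2 matches the hadoop3.2 binary distribution.
./build/mvn -Phadoop-cloud -Phadoop-3.2 -DskipTests clean install
```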
-1 (non-binding)
I filed SPARK-31257 as a blocker, and now others are starting to agree that
it's a critical issue which should be dealt with before releasing Spark 3.0.
Please refer to the recent comments in https://github.com/apache/spark/pull/28026
It shouldn't delay the release much, as we can either revert
That is a very unusual practice...
Yea, release candidates are different from the preview version, as release
candidates are not official releases, so they won't appear in Maven
Central, can't be downloaded in the Spark official website, etc.
On Wed, Apr 1, 2020 at 12:32 PM Sean Owen wrote:
These are release candidates, not the final release, so they won't be
published to Maven Central. The naming matches what the final release would
be.
On Tue, Mar 31, 2020 at 11:25 PM Stephen Coy
wrote:
Furthermore, the spark jars in these bundles all look like release versions:
[scoy@Steves-Core-i9 spark-3.0.0-bin-hadoop3.2]$ ls -l jars/spark-*
-rw-r--r--@ 1 scoy staff 9261223 31 Mar 20:55
jars/spark-catalyst_2.12-3.0.0.jar
-rw-r--r--@ 1 scoy staff 9720421 31 Mar 20:55 jars/spark-core_2.12-
The download artifacts all seem to have the “RC1” missing from their names.
e.g. spark-3.0.0-bin-hadoop3.2.tgz
Cheers,
Steve C
On 1 Apr 2020, at 2:04 pm, Reynold Xin <r...@databricks.com> wrote:
Please vote on releasing the following candidate as Apache Spark version 3.0.0.
The vote is open until 11:59pm Pacific time Fri Apr 3, and passes if a
majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
[ ] +1 Release this package as Apache Spark 3.0.0
[ ] -1 Do not release this package
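As a starting point for checking a candidate, the usual first step is verifying signatures and checksums on the downloaded artifacts; a minimal sketch, assuming the .asc and .sha512 files sit alongside the tarball (file names are illustrative, not from the vote email):

```shell
# Verify the detached GPG signature against the release tarball.
gpg --verify spark-3.0.0-bin-hadoop3.2.tgz.asc spark-3.0.0-bin-hadoop3.2.tgz
# Compute the digest and compare it against the published .sha512 file.
sha512sum spark-3.0.0-bin-hadoop3.2.tgz
```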
Hi, Dongjoon,
You can backport the commits from master to 3.0, as long as they follow our
code freeze policy. Feel free to -1 on RC1 vote if your backported PRs are
blocking the release.
Cheers,
Xiao
On Tue, Mar 31, 2020 at 9:28 AM Dongjoon Hyun
wrote:
Hi, All.
RC1 tag was created yesterday and traditionally we hold off on all backporting
activities to give the release manager some time. I'm also holding two
commits on the master branch.
https://github.com/apache/spark/tree/v3.0.0-rc1
However, I'm still seeing some commits land on `branch-3.0`.