Hi, Yuming.

Could you summarize the vote result?

Bests,
Dongjoon.

On Wed, Dec 18, 2019 at 19:28 Wenchen Fan <cloud0...@gmail.com> wrote:

> +1, all tests pass
>
> On Thu, Dec 19, 2019 at 7:18 AM Takeshi Yamamuro <linguin....@gmail.com>
> wrote:
>
>> Thanks, Yuming!
>>
>> I checked the links and the prepared binaries.
>> Also, I ran tests with -Pyarn -Phadoop-2.7 -Phive -Phive-thriftserver
>> -Pmesos -Pkubernetes -Psparkr
>> on Java version "1.8.0_181".
>> All of the above looks fine.
>>
>> Bests,
>> Takeshi
>>
>> On Thu, Dec 19, 2019 at 6:31 AM Dongjoon Hyun <dongjoon.h...@gmail.com>
>> wrote:
>>
>>> +1
>>>
>>> I also checked the signatures and docs, and built and tested with JDK
>>> 11.0.5, Hadoop 3.2, and Hive 2.3.
>>> In addition, the newly added
>>> `spark-3.0.0-preview2-bin-hadoop2.7-hive1.2.tgz` distribution looks correct.
>>>
>>> Thank you Yuming and all.
>>>
>>> Bests,
>>> Dongjoon.
>>>
>>>
>>> On Tue, Dec 17, 2019 at 4:11 PM Sean Owen <sro...@apache.org> wrote:
>>>
>>>> Same result as last time. It all looks good and tests pass for me on
>>>> Ubuntu with all profiles enabled (Hadoop 3.2 + Hive 2.3), building
>>>> from source.
>>>> 'pyspark-3.0.0.dev2.tar.gz' appears to be the desired Python artifact
>>>> name, yes.
>>>> +1
>>>>
>>>> On Tue, Dec 17, 2019 at 12:36 AM Yuming Wang <wgy...@gmail.com> wrote:
>>>> >
>>>> > Please vote on releasing the following candidate as Apache Spark
>>>> version 3.0.0-preview2.
>>>> >
>>>> > The vote is open until December 20 PST and passes if a majority of
>>>> > +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>> >
>>>> > [ ] +1 Release this package as Apache Spark 3.0.0-preview2
>>>> > [ ] -1 Do not release this package because ...
>>>> >
>>>> > To learn more about Apache Spark, please see http://spark.apache.org/
>>>> >
>>>> > The tag to be voted on is v3.0.0-preview2-rc2 (commit
>>>> bcadd5c3096109878fe26fb0d57a9b7d6fdaa257):
>>>> > https://github.com/apache/spark/tree/v3.0.0-preview2-rc2
>>>> >
>>>> > The release files, including signatures, digests, etc. can be found
>>>> at:
>>>> > https://dist.apache.org/repos/dist/dev/spark/v3.0.0-preview2-rc2-bin/
>>>> >
>>>> > Signatures used for Spark RCs can be found in this file:
>>>> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>> >
>>>> > The staging repository for this release can be found at:
>>>> >
>>>> https://repository.apache.org/content/repositories/orgapachespark-1338/
>>>> >
>>>> > The documentation corresponding to this release can be found at:
>>>> >
>>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-preview2-rc2-docs/
>>>> >
>>>> > The list of bug fixes going into 3.0.0 can be found at the following
>>>> URL:
>>>> > https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>>> >
>>>> > FAQ
>>>> >
>>>> > =========================
>>>> > How can I help test this release?
>>>> > =========================
>>>> >
>>>> > If you are a Spark user, you can help us test this release by taking
>>>> > an existing Spark workload and running it on this release candidate,
>>>> > then reporting any regressions.
>>>> >
>>>> > If you're working in PySpark, you can set up a virtual env, install
>>>> > the current RC, and see if anything important breaks. In Java/Scala,
>>>> > you can add the staging repository to your project's resolvers and
>>>> > test with the RC (make sure to clean up the artifact cache
>>>> > before/after so you don't end up building with an out-of-date RC
>>>> > going forward).
>>>> >
>>>> > ===========================================
>>>> > What should happen to JIRA tickets still targeting 3.0.0?
>>>> > ===========================================
>>>> >
>>>> > The current list of open tickets targeted at 3.0.0 can be found at:
>>>> > https://issues.apache.org/jira/projects/SPARK and search for "Target
>>>> Version/s" = 3.0.0
>>>> >
>>>> > Committers should look at those and triage. Extremely important bug
>>>> > fixes, documentation, and API tweaks that impact compatibility should
>>>> > be worked on immediately.
>>>> >
>>>> > ==================
>>>> > But my bug isn't fixed?
>>>> > ==================
>>>> >
>>>> > In order to make timely releases, we will typically not hold the
>>>> > release unless the bug in question is a regression from the previous
>>>> > release. That being said, if there is something which is a regression
>>>> > that has not been correctly targeted please ping me or a committer to
>>>> > help target the issue.
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>>
>>>>
>>
>> --
>> ---
>> Takeshi Yamamuro
>>
>
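For reference, the Java/Scala testing step described in the vote email (adding the staging repository to a project's resolvers) could be sketched in sbt roughly as below. The staging URL is taken from the email; the `spark-sql` coordinates and version string are assumptions for illustration, not a verified dependency line:

```scala
// build.sbt sketch (hypothetical test project): resolve Spark artifacts
// from the RC staging repository instead of Maven Central.
// Staging URL is from the vote email; the spark-sql coordinates below
// are an assumed example dependency.
resolvers += "Apache Spark RC staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1338/"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.0-preview2"
```

As the email notes, the local artifact cache (e.g. `~/.ivy2` / `~/.m2`) should be cleaned before and after testing so later builds don't pick up the out-of-date RC.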
