+1. All our pipelines have been running the RC for several days now.

On Mon, Feb 26, 2018 at 10:33 AM, Dongjoon Hyun <dongjoon.h...@gmail.com>
wrote:

> +1 (non-binding).
>
> Bests,
> Dongjoon.
>
>
>
> On Mon, Feb 26, 2018 at 9:14 AM, Ryan Blue <rb...@netflix.com.invalid>
> wrote:
>
>> +1 (non-binding)
>>
>> On Sat, Feb 24, 2018 at 4:17 PM, Xiao Li <gatorsm...@gmail.com> wrote:
>>
>>> +1 (binding) in Spark SQL, Core and PySpark.
>>>
>>> Xiao
>>>
>>> 2018-02-24 14:49 GMT-08:00 Ricardo Almeida <ricardo.alme...@actnowib.com>:
>>>
>>>> +1 (non-binding)
>>>>
>>>> same as previous RC
>>>>
>>>> On 24 February 2018 at 11:10, Hyukjin Kwon <gurwls...@gmail.com> wrote:
>>>>
>>>>> +1
>>>>>
>>>>> 2018-02-24 16:57 GMT+09:00 Bryan Cutler <cutl...@gmail.com>:
>>>>>
>>>>>> +1
>>>>>> Tests passed; I additionally ran Arrow-related tests and did some
>>>>>> perf checks with Python 2.7.14.
>>>>>>
>>>>>> On Fri, Feb 23, 2018 at 6:18 PM, Holden Karau <hol...@pigscanfly.ca>
>>>>>> wrote:
>>>>>>
>>>>>>> Note: given the state of Jenkins I'd love to see Bryan Cutler or
>>>>>>> someone with Arrow experience sign off on this release.
>>>>>>>
>>>>>>> On Fri, Feb 23, 2018 at 6:13 PM, Cheng Lian <lian.cs....@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> +1 (binding)
>>>>>>>>
>>>>>>>> Passed all the tests, looks good.
>>>>>>>>
>>>>>>>> Cheng
>>>>>>>>
>>>>>>>> On 2/23/18 15:00, Holden Karau wrote:
>>>>>>>>
>>>>>>>> +1 (binding)
>>>>>>>> PySpark artifacts install in a fresh Py3 virtual env
>>>>>>>>
>>>>>>>> On Feb 23, 2018 7:55 AM, "Denny Lee" <denny.g....@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> +1 (non-binding)
>>>>>>>>>
>>>>>>>>> On Fri, Feb 23, 2018 at 07:08 Josh Goldsborough <joshgoldsboroughs...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> New to testing Spark RCs for the community, but I was able to
>>>>>>>>>> run some of the basic unit tests without error, so for what it's
>>>>>>>>>> worth, I'm a +1.
>>>>>>>>>>
>>>>>>>>>> On Thu, Feb 22, 2018 at 4:23 PM, Sameer Agarwal <samee...@apache.org> wrote:
>>>>>>>>>>
>>>>>>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>>>>>>> version 2.3.0. The vote is open until Tuesday February 27, 2018 at 
>>>>>>>>>>> 8:00:00
>>>>>>>>>>> am UTC and passes if a majority of at least 3 PMC +1 votes are cast.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> [ ] +1 Release this package as Apache Spark 2.3.0
>>>>>>>>>>>
>>>>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> To learn more about Apache Spark, please see
>>>>>>>>>>> https://spark.apache.org/
>>>>>>>>>>>
>>>>>>>>>>> The tag to be voted on is v2.3.0-rc5:
>>>>>>>>>>> https://github.com/apache/spark/tree/v2.3.0-rc5
>>>>>>>>>>> (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)
>>>>>>>>>>>
>>>>>>>>>>> List of JIRA tickets resolved in this release can be found here:
>>>>>>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12339551
>>>>>>>>>>>
>>>>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>>>>> found at:
>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/
>>>>>>>>>>>
>>>>>>>>>>> Release artifacts are signed with the following key:
>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>>>>
>>>>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1266/
>>>>>>>>>>>
>>>>>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs/_site/index.html
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> FAQ
>>>>>>>>>>>
>>>>>>>>>>> =======================================
>>>>>>>>>>> What are the unresolved issues targeted for 2.3.0?
>>>>>>>>>>> =======================================
>>>>>>>>>>>
>>>>>>>>>>> Please see https://s.apache.org/oXKi. At the time of writing,
>>>>>>>>>>> there are no known release blockers.
>>>>>>>>>>>
>>>>>>>>>>> =========================
>>>>>>>>>>> How can I help test this release?
>>>>>>>>>>> =========================
>>>>>>>>>>>
>>>>>>>>>>> If you are a Spark user, you can help us test this release by
>>>>>>>>>>> taking an existing Spark workload, running it on this release
>>>>>>>>>>> candidate, and reporting any regressions.
>>>>>>>>>>>
>>>>>>>>>>> If you're working in PySpark, you can set up a virtual env,
>>>>>>>>>>> install the current RC, and see if anything important breaks. In
>>>>>>>>>>> Java/Scala, you can add the staging repository to your project's
>>>>>>>>>>> resolvers and test with the RC (make sure to clean up the
>>>>>>>>>>> artifact cache before and after so you don't end up building
>>>>>>>>>>> with an out-of-date RC going forward).
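As a sketch of the Java/Scala path just described, an sbt build fragment along these lines should work; the repository URL is the staging link from this thread, while the `spark-sql` module choice and the resolver name are illustrative assumptions:

```scala
// build.sbt -- point the build at the RC staging repository so that the
// 2.3.0 artifacts resolve (URL taken from the staging link above).
resolvers += "Apache Spark 2.3.0 RC5 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1266/"

// Depend on the RC version; spark-sql is chosen here only as an example
// module -- substitute whichever Spark modules your project uses.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0" % Provided

// Note: clear the Spark entries from your local artifact cache
// (e.g. ~/.ivy2/cache/org.apache.spark) before and after testing so
// later builds don't silently pick up the RC artifacts.
```

After the vote passes and the final artifacts are published, the resolver line should be removed so the build falls back to Maven Central.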
>>>>>>>>>>>
>>>>>>>>>>> ===========================================
>>>>>>>>>>> What should happen to JIRA tickets still targeting 2.3.0?
>>>>>>>>>>> ===========================================
>>>>>>>>>>>
>>>>>>>>>>> Committers should look at those and triage. Extremely important
>>>>>>>>>>> bug fixes, documentation, and API tweaks that impact
>>>>>>>>>>> compatibility should be worked on immediately. Everything else
>>>>>>>>>>> should be retargeted to 2.3.1 or 2.4.0 as appropriate.
>>>>>>>>>>>
>>>>>>>>>>> ===================
>>>>>>>>>>> Why is my bug not fixed?
>>>>>>>>>>> ===================
>>>>>>>>>>>
>>>>>>>>>>> In order to make timely releases, we will typically not hold the
>>>>>>>>>>> release unless the bug in question is a regression from 2.2.0.
>>>>>>>>>>> That said, if there is something that is a regression from 2.2.0
>>>>>>>>>>> and has not been correctly targeted, please ping me or a
>>>>>>>>>>> committer to help target the issue (you can see the open issues
>>>>>>>>>>> listed as impacting Spark 2.3.0 at https://s.apache.org/WmoI).
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Twitter: https://twitter.com/holdenkarau
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>>
>> --
>> Ryan Blue
>> Software Engineer
>> Netflix
>>
>
>
