This may be a blocker. See my suggested action about voting on the
current artifact, which I believe would eliminate the possibly blocking
part of the issue in the short term.

On Tue, Apr 4, 2017, 22:02 Mridul Muralidharan <mri...@gmail.com> wrote:

> Hi,
>
>
> https://issues.apache.org/jira/browse/SPARK-20202?jql=priority%20%3D%20Blocker%20AND%20affectedVersion%20%3D%20%222.1.1%22%20and%20project%3D%22spark%22
>
>
> Indicates there is another blocker (SPARK-20197 should have appeared in
> the list too, but was marked Major).
>
>
> Regards,
> Mridul
>
> On Tue, Apr 4, 2017 at 11:35 AM, Michael Armbrust
> <mich...@databricks.com> wrote:
> > Thanks for the comments everyone.  This vote fails.  Here's how I think
> > we should proceed:
> >  - [SPARK-20197] - SparkR CRAN - appears to be resolved
> >  - [SPARK-XXXX] - Python packaging - Holden, please file a JIRA and
> > report if this is a regression and if there is an easy fix that we
> > should wait for.
> >
> > For all the other test failures, please take the time to look through
> > JIRA and open an issue if one does not already exist so that we can
> > triage if
> > these are just environmental issues.  If I don't hear any objections I'm
> > going to go ahead with RC3 tomorrow.
> >
> > On Sun, Apr 2, 2017 at 1:16 PM, Felix Cheung <felixcheun...@hotmail.com>
> > wrote:
> >>
> >> -1
> >> sorry, found an issue with SparkR CRAN check.
> >> Opened SPARK-20197 and am working on a fix.
> >>
> >> ________________________________
> >> From: holden.ka...@gmail.com <holden.ka...@gmail.com> on behalf of
> >> Holden Karau <hol...@pigscanfly.ca>
> >> Sent: Friday, March 31, 2017 6:25:20 PM
> >> To: Xiao Li
> >> Cc: Michael Armbrust; dev@spark.apache.org
> >> Subject: Re: [VOTE] Apache Spark 2.1.1 (RC2)
> >>
> >> -1 (non-binding)
> >>
> >> Python packaging doesn't seem to have quite worked out (looking at
> >> PKG-INFO, the description is "Description: !!!!! missing pandoc do not
> >> upload to PyPI !!!!"); ideally it would be nice to have this as a
> >> version we can upload to PyPI.
> >> Building this on my own machine results in a longer description.
> >>
> >> My guess is that whichever machine was used to package this is missing
> >> the pandoc executable (or possibly the pypandoc library).
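
(A rough sketch of the kind of pre-build check that could catch this,
assuming pypandoc is what produces the long description; I haven't
verified that against the actual packaging scripts, and the helper name
below is only illustrative:)

    # Rough sketch: fail the sdist build early if pandoc/pypandoc are missing,
    # so PKG-INFO gets a real long description instead of the placeholder.
    # (Assumes pypandoc is the mechanism used for the description conversion.)
    import shutil
    import sys

    def check_packaging_prereqs():
        problems = []
        if shutil.which("pandoc") is None:
            problems.append("pandoc executable not found on PATH")
        try:
            import pypandoc  # noqa: F401
        except ImportError:
            problems.append("pypandoc Python package not installed")
        return problems

    if __name__ == "__main__":
        issues = check_packaging_prereqs()
        if issues:
            print("Do not upload this build to PyPI:")
            for issue in issues:
                print("  - " + issue)
            sys.exit(1)
        print("pandoc and pypandoc look available.")
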
> >>
> >> On Fri, Mar 31, 2017 at 3:40 PM, Xiao Li <gatorsm...@gmail.com> wrote:
> >>>
> >>> +1
> >>>
> >>> Xiao
> >>>
> >>> 2017-03-30 16:09 GMT-07:00 Michael Armbrust <mich...@databricks.com>:
> >>>>
> >>>> Please vote on releasing the following candidate as Apache Spark
> >>>> version 2.1.1. The vote is open until Sun, April 2nd, 2017 at 16:30
> >>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
> >>>>
> >>>> [ ] +1 Release this package as Apache Spark 2.1.1
> >>>> [ ] -1 Do not release this package because ...
> >>>>
> >>>>
> >>>> To learn more about Apache Spark, please see http://spark.apache.org/
> >>>>
> >>>> The tag to be voted on is v2.1.1-rc2
> >>>> (02b165dcc2ee5245d1293a375a31660c9d4e1fa6)
> >>>>
> >>>> List of JIRA tickets resolved can be found with this filter.
> >>>>
> >>>> The release files, including signatures, digests, etc. can be
> >>>> found at:
> >>>> http://home.apache.org/~pwendell/spark-releases/spark-2.1.1-rc2-bin/
> >>>>
> >>>> Release artifacts are signed with the following key:
> >>>> https://people.apache.org/keys/committer/pwendell.asc
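
(For anyone verifying downloads, a minimal sketch of the usual signature
check; the artifact filename below is only an assumed example from the
bin/ directory, and this simply shells out to curl and gpg:)

    # Minimal sketch: import the release manager's key and verify a downloaded
    # artifact against its detached .asc signature using GnuPG.
    # Assumes the artifact and its .asc were already fetched from the bin/ URL
    # above; ARTIFACT is an assumed example name, not necessarily the real one.
    import subprocess

    KEY_URL = "https://people.apache.org/keys/committer/pwendell.asc"
    ARTIFACT = "spark-2.1.1-bin-hadoop2.7.tgz"  # assumed example filename

    subprocess.run(["curl", "-sSLO", KEY_URL], check=True)
    subprocess.run(["gpg", "--import", "pwendell.asc"], check=True)
    subprocess.run(["gpg", "--verify", ARTIFACT + ".asc", ARTIFACT], check=True)
    print("Signature verified for " + ARTIFACT)
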
> >>>>
> >>>> The staging repository for this release can be found at:
> >>>> https://repository.apache.org/content/repositories/orgapachespark-1227/
> >>>>
> >>>> The documentation corresponding to this release can be found at:
> >>>> http://people.apache.org/~pwendell/spark-releases/spark-2.1.1-rc2-docs/
> >>>>
> >>>>
> >>>> FAQ
> >>>>
> >>>> How can I help test this release?
> >>>>
> >>>> If you are a Spark user, you can help us test this release by taking
> >>>> an existing Spark workload and running it on this release candidate,
> >>>> then reporting any regressions.
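
(Even a tiny workload helps here; for example, a minimal smoke test along
these lines, which is just a sketch rather than an official test, and
assumes it is launched with the RC's bin/spark-submit or with the RC's
pyspark importable:)

    # Minimal smoke test against the 2.1.1 RC (a sketch, not an official test).
    # Assumes it is run with the RC's bin/spark-submit, or that the RC's
    # pyspark package is on the Python path.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("spark-2.1.1-rc2-smoke-test").getOrCreate()

    # Tiny DataFrame workload: bucket 1000 ids and aggregate.
    df = spark.range(1000).selectExpr("id", "id % 7 AS bucket")
    counts = df.groupBy("bucket").count().orderBy("bucket").collect()

    assert sum(r["count"] for r in counts) == 1000, "row count mismatch"
    print("Spark version: " + spark.version)
    print("bucket counts: " + str([(r["bucket"], r["count"]) for r in counts]))

    spark.stop()
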
> >>>>
> >>>> What should happen to JIRA tickets still targeting 2.1.1?
> >>>>
> >>>> Committers should look at those and triage. Extremely important bug
> >>>> fixes, documentation, and API tweaks that impact compatibility should
> >>>> be worked on immediately. Everything else, please retarget to 2.1.2
> >>>> or 2.2.0.
> >>>>
> >>>> But my bug isn't fixed!??!
> >>>>
> >>>> In order to make timely releases, we will typically not hold the
> >>>> release unless the bug in question is a regression from 2.1.0.
> >>>>
> >>>> What happened to RC1?
> >>>>
> >>>> There were issues with the release packaging and, as a result, it was
> >>>> skipped.
> >>>
> >>>
> >>
> >>
> >>
> >> --
> >> Cell : 425-233-8271
> >> Twitter: https://twitter.com/holdenkarau
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
