Krishna,

Docs don't block the RC vote, because the documentation can be updated in
parallel with the release candidates right up until the release is made.


On Fri, Nov 28, 2014 at 9:55 PM, Krishna Sankar <ksanka...@gmail.com> wrote:

> Looks like the documentation hasn't caught up with the new features, for
> example on the machine learning side: org.apache.spark.ml, RandomForest,
> gbtree, and so forth. Is a refresh of the documentation planned?
> I am happy to see these capabilities, but they will need good explanations
> as well, especially the new thinking around the ml pipelines,
> transformations, et al.
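>
> For instance, here is a minimal sketch of the new spark.ml pipelines as I
> read them from the preview docs (the Tokenizer/HashingTF/LogisticRegression
> stages and column names below are just my illustrative choices, not from
> this release thread):
>
>   import org.apache.spark.ml.Pipeline
>   import org.apache.spark.ml.classification.LogisticRegression
>   import org.apache.spark.ml.feature.{HashingTF, Tokenizer}
>
>   // Each stage is a Transformer or an Estimator; the Pipeline chains them
>   // so that a single fit() call produces one PipelineModel.
>   val tokenizer = new Tokenizer()
>     .setInputCol("text")
>     .setOutputCol("words")
>   val hashingTF = new HashingTF()
>     .setInputCol(tokenizer.getOutputCol)
>     .setOutputCol("features")
>   val lr = new LogisticRegression()
>     .setMaxIter(10)
>   val pipeline = new Pipeline()
>     .setStages(Array(tokenizer, hashingTF, lr))
>
>   // "training" is assumed to be a SchemaRDD with "text" and "label" columns.
>   val model = pipeline.fit(training)
>
> An end-to-end example along these lines is exactly the kind of explanation
> the docs would need.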
> IMHO, the documentation is a -1.
> Will check out the compilation, MLlib, et al.
>
> Cheers
> <k/>
>
> On Fri, Nov 28, 2014 at 9:16 PM, Patrick Wendell <pwend...@gmail.com>
> wrote:
>
> > Please vote on releasing the following candidate as Apache Spark version
> > 1.2.0!
> >
> > The tag to be voted on is v1.2.0-rc1 (commit 1056e9ec1):
> >
> >
> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=1056e9ec13203d0c51564265e94d77a054498fdb
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-1.2.0-rc1/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > https://repository.apache.org/content/repositories/orgapachespark-1048/
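> >
> > If you want to run existing applications against this RC, one way is to
> > point sbt at the staging repository; a hedged build.sbt sketch (the
> > resolver name and module selection are mine, the URL is the one above):
> >
> >   // build.sbt fragment for testing against the RC1 staging repository
> >   resolvers += "Apache Spark 1.2.0 RC1 staging" at "https://repository.apache.org/content/repositories/orgapachespark-1048/"
> >
> >   libraryDependencies ++= Seq(
> >     "org.apache.spark" %% "spark-core"  % "1.2.0",
> >     "org.apache.spark" %% "spark-mllib" % "1.2.0"
> >   )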
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-1.2.0-rc1-docs/
> >
> > Please vote on releasing this package as Apache Spark 1.2.0!
> >
> > The vote is open until Tuesday, December 02, at 05:15 UTC and passes
> > if a majority of at least 3 +1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 1.2.0
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see
> > http://spark.apache.org/
> >
> > == What justifies a -1 vote for this release? ==
> > This vote is happening later in the QA period than previous votes, so
> > -1 votes should only occur for significant regressions from 1.1.X.
> > Bugs already present in 1.1.X, minor regressions, or bugs related to
> > new features will not block this release.
> >
> > == What default changes should I be aware of? ==
> > 1. The default value of "spark.shuffle.blockTransferService" has been
> > changed to "netty".
> > --> Old behavior can be restored by setting it to "nio" (see the
> > SparkConf sketch after item 2).
> >
> > 2. The default value of "spark.shuffle.manager" has been changed to
> > "sort".
> > --> Old behavior can be restored by setting "spark.shuffle.manager" to
> > "hash".
> >
> > == Other notes ==
> > Because this vote is occurring over a weekend, I will likely extend
> > the vote if this RC survives until the end of the vote period.
> >
> > - Patrick
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> > For additional commands, e-mail: dev-h...@spark.apache.org
> >
> >
>
