If it's easy enough to produce them, I agree you can just add them to the
RC dir.

On Thu, Jun 28, 2018 at 11:56 AM Marcelo Vanzin <van...@cloudera.com.invalid>
wrote:

> I just noticed this RC is missing builds for hadoop 2.3 and 2.4, which
> existed in the previous version:
> https://dist.apache.org/repos/dist/release/spark/spark-2.1.2/
>
> How important do we think those are? I think I can just build them and
> publish them to the RC directory without having to create a new RC.
>
> On Tue, Jun 26, 2018 at 1:25 PM, Marcelo Vanzin <van...@cloudera.com>
> wrote:
> > Please vote on releasing the following candidate as Apache Spark
> > version 2.1.3.
> >
> > The vote is open until Fri, June 29th @ 9PM UTC (2PM PDT) and passes if
> > a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >
> > [ ] +1 Release this package as Apache Spark 2.1.3
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see http://spark.apache.org/
> >
> > The tag to be voted on is v2.1.3-rc2 (commit b7eac07b):
> > https://github.com/apache/spark/tree/v2.1.3-rc2
> >
> > The release files, including signatures, digests, etc. can be found at:
> > https://dist.apache.org/repos/dist/dev/spark/v2.1.3-rc2-bin/
> >
> > Signatures used for Spark RCs can be found in this file:
> > https://dist.apache.org/repos/dist/dev/spark/KEYS
> >
> > The staging repository for this release can be found at:
> > https://repository.apache.org/content/repositories/orgapachespark-1275/
> >
> > The documentation corresponding to this release can be found at:
> > https://dist.apache.org/repos/dist/dev/spark/v2.1.3-rc2-docs/
> >
> > The list of bug fixes going into 2.1.3 can be found at the following URL:
> > https://issues.apache.org/jira/projects/SPARK/versions/12341660
> >
> > Notes:
> >
> > - RC1 was not sent for a vote. I had trouble building it, and by the
> >   time I got things fixed, there was a blocker bug filed. It was already
> >   tagged in git at that time.
> >
> > - If testing the source package, I recommend using Java 8, even though
> >   2.1 supports Java 7 (and the RC was built with JDK 7). This is because
> >   Maven Central has updated some configuration that makes the default
> >   Java 7 SSL config not work.
> >
> > - There are Maven artifacts published for Scala 2.10, but binary
> >   releases are only available for Scala 2.11. This matches the previous
> >   release (2.1.2), but if there's a need / desire to have pre-built
> >   distributions for Scala 2.10, I can probably amend the RC without
> >   having to create a new one.
> >
> > FAQ
> >
> > =========================
> > How can I help test this release?
> > =========================
> >
> > If you are a Spark user, you can help us test this release by taking
> > an existing Spark workload and running it on this release candidate, then
> > reporting any regressions.
> >
> > If you're working in PySpark, you can set up a virtual env, install
> > the current RC, and see if anything important breaks. In Java/Scala,
> > you can add the staging repository to your project's resolvers and
> > test with the RC (make sure to clean up the artifact cache before/after
> > so you don't end up building with an out-of-date RC going forward).
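> >
> > For instance, here is a minimal build.sbt sketch for pulling the RC
> > artifacts from the staging repository. It assumes sbt with Scala 2.11
> > and that the staged artifacts use the plain 2.1.3 version string;
> > adjust the module and version to match your own project:
> >
> >   // Resolve Spark 2.1.3 RC2 artifacts from the Apache staging repository.
> >   resolvers += "Apache Spark 2.1.3 RC2 staging" at
> >     "https://repository.apache.org/content/repositories/orgapachespark-1275/"
> >
> >   // Build against the release candidate (spark-core shown as an example).
> >   libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.3" % "provided"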
> >
> > ===========================================
> > What should happen to JIRA tickets still targeting 2.1.3?
> > ===========================================
> >
> > The current list of open tickets targeted at 2.1.3 can be found at:
> > https://s.apache.org/spark-2.1.3
> >
> > Committers should look at those and triage. Extremely important bug
> > fixes, documentation, and API tweaks that impact compatibility should
> > be worked on immediately. Please retarget everything else to an
> > appropriate release.
> >
> > ==================
> > But my bug isn't fixed?
> > ==================
> >
> > In order to make timely releases, we will typically not hold the
> > release unless the bug in question is a regression from the previous
> > release. That being said, if there is a regression that has not been
> > correctly targeted, please ping me or a committer to help target the
> > issue.
> >
> >
> > --
> > Marcelo
>
>
>
> --
> Marcelo
>
