Minor question, but when would be the right time to update the default
Spark version
<https://github.com/apache/spark/blob/76386e1a23c55a58c0aeea67820aab2bac71b24b/ec2/spark_ec2.py#L42>
in the EC2 script?
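
For reference, the default lives in a module-level constant near the top of
ec2/spark_ec2.py; at the linked commit it looks roughly like this (the exact
version string is my assumption, not verified against that commit):

    # ec2/spark_ec2.py (approximate; see the linked line for the exact value).
    # Clusters launched without an explicit --spark-version get this release.
    DEFAULT_SPARK_VERSION = "1.1.0"

Presumably the bump should wait until the 1.2.0 artifacts are actually
published, since the script resolves this version string to a released package.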

On Mon, Nov 3, 2014 at 3:55 AM, Patrick Wendell <pwend...@gmail.com> wrote:

> Hi All,
>
> I've just cut the release branch for Spark 1.2, consistent with the
> end of the scheduled feature window for the release. New commits to
> master will need to be explicitly merged into branch-1.2 in order to
> be in the release.
>
> This begins the transition into a QA period for Spark 1.2, with a
> focus on testing and fixes. A few smaller features may still go in as
> folks wrap up loose ends over the next 48 hours (or later, for
> components still in alpha).
>
> To help with QA, I'll try to package up a SNAPSHOT release soon for
> community testing; this worked well when testing Spark 1.1 before
> official votes started. I might give it a few days to allow committers
> to merge in back-logged fixes and other patches that were punted to
> after the feature freeze.
>
> Thanks to everyone who helped author and review patches over the last few
> weeks!
>
> - Patrick
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>
