+1 on extending the deadline. It will significantly improve the logistics
for upstreaming the Kubernetes back-end. Also agreed on the general
reality of reduced bandwidth over the Nov-Dec holiday season.
Erik

On Thu, Nov 9, 2017 at 6:03 PM, Matei Zaharia <matei.zaha...@gmail.com>
wrote:

> I’m also +1 on extending this to get Kubernetes and other features in.
>
> Matei
>
> > On Nov 9, 2017, at 4:04 PM, Anirudh Ramanathan <fox...@google.com.INVALID>
> wrote:
> >
> > This would help the community on the Kubernetes effort quite a bit,
> giving us additional time for reviews and testing for the 2.3 release.
> >
> > On Thu, Nov 9, 2017 at 3:56 PM, Justin Miller <
> justin.mil...@protectwise.com> wrote:
> > That sounds fine to me. I’m hoping that this ticket can make it into
> Spark 2.3: https://issues.apache.org/jira/browse/SPARK-18016
> >
> > It’s causing some pretty considerable problems when we alter the columns
> to be nullable, but we are OK for now without that.
> >
> > Best,
> > Justin
> >
> >> On Nov 9, 2017, at 4:54 PM, Michael Armbrust <mich...@databricks.com>
> wrote:
> >>
> >> According to the timeline posted on the website, we are nearing the
> branch cut for Spark 2.3. I'd like to propose pushing this out to
> mid-to-late December for a couple of reasons, and would like to hear what
> people think.
> >>
> >> 1. I've done release management during the Thanksgiving/Christmas
> period before, and in my experience we don't actually get a lot of testing
> during this time due to vacations and other commitments. I think beginning
> the RC process in early January would give us the best coverage in the
> shortest amount of time.
> >> 2. There are several large initiatives in progress that, given a little
> more time, would leave us with a much more exciting 2.3 release:
> specifically, the work on the history server, Kubernetes, and continuous
> processing.
> >> 3. Given the actual release date of Spark 2.2, I think we'll still get
> Spark 2.3 out roughly six months later.
> >>
> >> Thoughts?
> >>
> >> Michael
> >
> >
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
