Thank you Sean. Happy Diwali !!
-- Dilip
----- Original message -----
From: Xiao Li
To: u...@spark.apache.org, user
Subject: Happy Diwali everyone!!!
Date: Wed, Nov 7, 2018 3:10 PM
Happy Diwali everyone!!!
Xiao Li
It's not making 2.12 the default, but not dropping 2.11. Supporting
2.13 could mean supporting 3 Scala versions at once, which I claim is
just too much. I think the options are likely:
- Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
default. Add 2.13 support in 3.x and drop 2.11.
Ok, got it -- it's really just an argument for not supporting all of 2.11, 2.12,
and 2.13 at the same time; 2.12 is always supported; the open question is when we
stop 2.11 support and start 2.13 support.
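To make the cost concrete: a fair amount of 2.11/2.12-era collection code does not
compile on 2.13 without shims. A made-up illustration (my own sketch, not taken
from Spark's sources), assuming plain scala.collection usage:

  import scala.collection.breakOut

  object Example {
    // Common 2.11/2.12 idiom: build a Map directly while mapping, without an
    // intermediate Seq[(String, Int)]. breakOut was removed in the 2.13
    // collections rework, so code like this needs per-version variants if we
    // cross-build 2.11, 2.12 and 2.13 at once.
    def index(words: Seq[String]): Map[String, Int] =
      words.zipWithIndex.map { case (w, i) => w -> i }(breakOut)
  }

That kind of churn is why each extra Scala version tends to mean extra source
shims and a bigger test matrix, not just another build profile.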
On Wed, Nov 7, 2018 at 11:10 AM Sean Owen wrote:
> It's not making 2.12 the default, but not dropping 2.11. Supporting
> 2
I'm not following "exclude Scala 2.13". Is there something inherent in
making 2.12 the default Scala version in Spark 3.0 that would prevent us
from supporting the option of building with 2.13?
On Tue, Nov 6, 2018 at 5:48 PM Sean Owen wrote:
> That's possible here, sure. The issue is: would you
+1 seems reasonable at this point.
Tom
On Tuesday, November 6, 2018, 1:24:16 PM CST, DB Tsai
wrote:
Given Oracle's new 6-month release model, I feel the only realistic option is to
test and support only LTS JDK releases, such as JDK 11 and future LTS releases. I would
like to have a discussion o
Agree with the points Felix made.
One thing to note: it looks like the only problem is the vignettes, and the
tests are being skipped as designed. If you see
https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Windows/00check.log
and
https://win-builder.r-project.org/incomin
Use the Fix Version instead. Target Version is only used occasionally
to mark that a JIRA is intended for a release. It isn't set on most of
them that are rapidly created and resolved.
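For example, to list what actually went into 2.4.0, a query along these lines
(a sketch; adjust the status filter to taste) is more reliable:

project = SPARK AND fixVersion = 2.4.0 AND status in (Resolved, Closed)

Filtering on Target Version instead undercounts for exactly the reason above.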
There is some explanation of the few Resolution statuses that are used
consistently, in http://spark.apache.org/c
I spoke with the Scala team at Lightbend. They plan to do a 2.13-RC1
release in January and GA a few months later. Of course, nothing is ever
certain. What's the thinking for the Spark 3.0 timeline? If it's likely to
be late Q1 or in Q2, then it might make sense to add Scala 2.13 as an
alternative
Red Hat:
https://access.redhat.com/articles/1299013#OpenJDK_Lifecycle_Dates_and_RHEL_versions
Stavros
On Wed, Nov 7, 2018 at 12:13 PM, Kazuaki Ishizaki
wrote:
> This entry includes a good figure for support lifecycle.
> https://www.azul.com/products/zulu-and-zulu-enterprise/zulu-
> enterprise-j
This entry includes a good figure for support lifecycle.
https://www.azul.com/products/zulu-and-zulu-enterprise/zulu-enterprise-java-support-options/
Kazuaki Ishizaki,
From: Marcelo Vanzin
To: Felix Cheung
Cc: Ryan Blue, sn...@snazy.de, dev, Cesar Delgado
Date: 2018/11/07 08:2
Hi,
I've been trying to find the issues that are part of 2.4.0 and used the
following query:
project = SPARK AND resolution in (Resolved, Done, Fixed) and "Target
Version/s" = "2.4.0"
I got 202 issues. Is that correct? What's the difference between the
Resolution statuses: Resolved, Done, Fixed?