Yes, some organizations do lag behind the current release, sometimes by a
significant amount. That is a bug, not a feature -- and one that increases
pressure toward fragmentation of the Spark community. To date, that hasn't
been a significant problem, and I think that is mainly because the factors...

On Wed, Apr 6, 2016 at 2:57 PM, Mark Hamstra wrote:

> ... My concern is that either of those options will take more resources
> than some Spark users will have available in the ~3 months remaining before
> Spark 2.0.0, which will cause fragmentation into Spark 1.x and Spark 2.x
> user communities.
I agree with your general logic and understanding of semver. That is why,
if we are going to violate the strictures of semver, I'd only be happy
doing so if support for Java 7 and/or Scala 2.10 were clearly understood to
be deprecated already in the 2.0.0 release -- i.e. from the outset not to
be...
On Fri, Apr 1, 2016 at 10:00 PM, Raymond Honderdors <raymond.honderd...@sizmek.com> wrote:

> What about a separate branch for scala 2.10?
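For context, the usual sbt alternative to a separate 2.10 branch is cross-building one branch against both Scala versions. A minimal sketch, with illustrative settings and version numbers (not Spark's actual build definition):

    // build.sbt -- illustrative only, not Spark's real build
    // One source tree, two Scala binary versions, selected at build time.
    scalaVersion       := "2.11.8"                  // default build
    crossScalaVersions := Seq("2.10.6", "2.11.8")   // versions to cross-build

    // "sbt +package" then builds once per entry in crossScalaVersions,
    // producing artifacts with _2.10 and _2.11 suffixes.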
In general, I agree - it is preferable to break backward compatibility
(where unavoidable) only at major versions.
Unfortunately, a change like this is usually planned better - with earlier
versions announcing the intent: deprecation across multiple releases,
defaults changed, etc.
From the thread, ...
Answering for myself: I assume everyone is following
http://semver.org/ semantic versioning. If not, it would be good to hear
an alternative theory.
For semver, strictly speaking, minor releases should be
backwards-compatible for callers. Are things like stopping support for
Java 7 or Scala 2.10 backwards-compatible?
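Concretely, the Scala binary version leaks into downstream builds through the artifact name, which is why dropping a Scala version is a breaking change for callers. A minimal sketch of a downstream sbt dependency (version numbers are illustrative):

    // Downstream project depending on Spark from sbt.
    // The %% operator appends the project's Scala binary version to the
    // artifact name, so this resolves to spark-core_2.10 or spark-core_2.11.
    scalaVersion := "2.10.6"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"

    // If a later 2.x minor release stopped publishing _2.10 artifacts, a build
    // like this would break even though only Spark's minor version changed --
    // which is exactly the semver question raised above.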
From: Koert Kuipers
Date: 4/2/2016 02:10 (GMT+02:00)
To: Michael Armbrust
Cc: Matei Zaharia, Mark Hamstra <m...@clearstorydata.com>, Cody Koeninger, Sean Owen <so...@cloudera.com>, dev@spark.apache.org
Subject: Re: Discuss: commit to Scala 2.10 support for Spark 2.x lifecycle
as long as we don't lock ourselves into supporting scala 2.10 for the entire
spark 2 lifespan it sounds reasonable to me
On Wed, Mar 30, 2016 at 3:25 PM, Michael Armbrust <mich...@databricks.com> wrote:

> +1 to Matei's reasoning.
+1 to Matei's reasoning.

On Wed, Mar 30, 2016 at 9:21 AM, Matei Zaharia wrote:

> I agree that putting it in 2.0 doesn't mean keeping Scala 2.10 for the
> entire 2.x line. My vote is to keep Scala 2.10 in Spark 2.0, because it's
> the default version we built with in 1.x. We want to make the transition...
I agree that putting it in 2.0 doesn't mean keeping Scala 2.10 for the entire
2.x line. My vote is to keep Scala 2.10 in Spark 2.0, because it's the default
version we built with in 1.x. We want to make the transition from 1.x to 2.0 as
easy as possible. In 2.0, we'll have the default downloads...
oh wow, had no idea it got ripped out

On Wed, Mar 30, 2016 at 11:50 AM, Mark Hamstra wrote:

> No, with 2.0 Spark really doesn't use Akka:
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkConf.scala#L744
>
> On Wed, Mar 30, 2016 at 9:10 AM, Koert Kuipers wrote: ...
My concern is that for some of those stuck using 2.10 because of some
library dependency, three months isn't sufficient time to refactor their
infrastructure to be compatible with Spark 2.0.0 if that requires Scala
2.11. The additional 3-6 months would make it much more feasible for those
users to...
No, with 2.0 Spark really doesn't use Akka:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkConf.scala#L744
On Wed, Mar 30, 2016 at 9:10 AM, Koert Kuipers wrote:

> Spark still runs on akka. So if you want the benefits of the latest akka
> (not saying we do, was just an example) then you need to drop scala 2.10
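For anyone porting job configuration, the practical consequence of the Akka removal is that the old spark.akka.* tuning knobs give way to spark.rpc.* equivalents. A minimal sketch in Scala -- the spark.akka.frameSize to spark.rpc.message.maxSize mapping is my reading of the deprecation list linked above, so verify it against the release you actually run:

    import org.apache.spark.SparkConf

    object RpcConfExample {
      // 1.x jobs tuned the Akka-based RPC layer via settings such as
      // spark.akka.frameSize; with the Akka-free RPC layer the equivalent
      // knob is assumed to be spark.rpc.message.maxSize (max control-plane
      // message size, in MB) -- check the docs for your release.
      val conf = new SparkConf()
        .setAppName("example")
        .set("spark.rpc.message.maxSize", "128")
    }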
Spark still runs on akka. So if you want the benefits of the latest akka
(not saying we do, was just an example) then you need to drop scala 2.10
On Mar 30, 2016 10:44 AM, "Cody Koeninger" wrote:

> I agree with Mark in that I don't see how supporting scala 2.10 for
> spark 2.0 implies supporting it for all of spark 2.x
Yeah, it is not crazy to drop support for something foundational like this
in a feature release, but it is something ideally coupled to a major release.
You could at least say it is probably a decision to keep supporting it through
the end of the year, given how releases are likely to go. Given the
availability...
I agree with Mark in that I don't see how supporting scala 2.10 for
spark 2.0 implies supporting it for all of spark 2.x

Regarding Koert's comment on akka, I thought all akka dependencies
had been removed from spark after SPARK-7997 and the recent removal
of external/akka

On Wed, Mar 30, 2016 at ...
Dropping Scala 2.10 support has to happen at some point, so I'm not
fundamentally opposed to the idea; but I've got questions about how we go
about making the change and what degree of negative consequences we are
willing to accept. Until now, we have been saying that 2.10 support will
be continued...
about that pro, i think it's more the opposite: many libraries have
stopped maintaining scala 2.10 versions. bugs will no longer be fixed for
scala 2.10, and new libraries will not be available for scala 2.10 at all,
making them unusable in spark.
take for example akka, a distributed messaging library...
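To make that failure mode concrete: a build pinned to Scala 2.10 simply cannot resolve a library that only publishes _2.11 artifacts. A minimal sbt sketch, using a hypothetical library name:

    // Illustrative only -- "some-lib" stands in for any library that publishes
    // artifacts for Scala 2.11 but no longer for 2.10.
    scalaVersion := "2.10.6"
    libraryDependencies += "com.example" %% "some-lib" % "1.0.0"

    // sbt expands %% to some-lib_2.10 and fails at resolution time with
    // something like: unresolved dependency: com.example#some-lib_2.10;1.0.0: not found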