SparkR issue with array types in gapply()

2016-10-25 Thread shirisht
Hello, I am getting an exception from Catalyst when array types are used in the return schema of the gapply() function. Following is a (made-up) example: iris$flag = base::sample(1:2, nrow(iris), T, prob = c(0.5,0.5)) irisdf = createDataF

Spark deployed as a Snap package

2016-10-25 Thread Michael Hall
I’m Michael Hall. I work at Canonical as part of the engineering team around Ubuntu and Snapcraft [1]. We’re working on snaps, a platform to enable ISVs to directly control delivery of software updates to their users, and make their software available to a considerably wider audience. There has

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Nicholas Chammas
Agreed. Would an announcement/reminder on the dev and user lists suffice in this case? Basically, just point out what's already been mentioned in the 2.0 release notes, and include a link there so people know what we're referencing. On Tue, Oct 25, 2016 at 5:32 PM, Mark Hamstra wrote: > You're right; so

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Mark Hamstra
You're right; so we could remove Java 7 support in 2.1.0. Both Holden and I not having the facts immediately to mind does suggest, however, that we should be doing a better job of making sure that information about deprecated language versions is inescapably public. That's harder to do with a lang

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Nicholas Chammas
No, I think our intent is that using a deprecated language version can generate warnings, but that it should still work; whereas once we remove support for a language version, then it really is ok for Spark developers to do things not compatible with that version and for users attempting to use tha

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Mark Hamstra
No, I think our intent is that using a deprecated language version can generate warnings, but that it should still work; whereas once we remove support for a language version, then it really is ok for Spark developers to do things not compatible with that version and for users attempting to use tha

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Nicholas Chammas
FYI: Support for both Python 2.6 and Java 7 was deprecated in 2.0 (see release notes under Deprecations). The deprecation notice didn't offer a specific timeline for completely dropping support other than to say they "might be removed in f

getting encoder implicits to be more accurate

2016-10-25 Thread Koert Kuipers
I am trying to use encoders as a typeclass: if it fails to find an ExpressionEncoder, it falls back to a KryoEncoder. The issue seems to be that ExpressionEncoder claims a little more than it can handle here: implicit def newProductEncoder[T <: Product : TypeTag]: Encoder[T] = Encoders.product
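
A minimal sketch of the typeclass-with-fallback pattern described above, assuming Spark 2.x. The trait/object/method names (LowPriorityEncoderFallback, TypedEncoders, kryoFallback, productEncoder) are made up for illustration; Encoders.product and Encoders.kryo are the actual Spark calls. The complaint is that the product encoder implicit matches every Product with a TypeTag, so the lower-priority Kryo fallback is never selected even when the product encoder cannot actually handle the fields:

    import scala.reflect.ClassTag
    import scala.reflect.runtime.universe.TypeTag
    import org.apache.spark.sql.{Encoder, Encoders}

    trait LowPriorityEncoderFallback {
      // Fallback: serialize with Kryo when no Catalyst encoder applies.
      implicit def kryoFallback[T: ClassTag]: Encoder[T] = Encoders.kryo[T]
    }

    object TypedEncoders extends LowPriorityEncoderFallback {
      // Preferred: Catalyst's product encoder for case classes and tuples.
      // Because this matches any Product with a TypeTag, it shadows the Kryo
      // fallback even for products whose fields Catalyst cannot encode, and
      // the failure only surfaces at runtime.
      implicit def productEncoder[T <: Product : TypeTag]: Encoder[T] =
        Encoders.product[T]
    }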

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Koert Kuipers
It will take time before all libraries that Spark depends on are available for Scala 2.12, so we are not talking Spark 2.1.x, and probably also not 2.2.x, for Scala 2.12. It technically makes sense to drop Java 7 and Scala 2.10 around the same time as Scala 2.12 is introduced. We are still heavily de

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Daniel Siegmann
After support is dropped for Java 7, can we have encoders for java.time classes (e.g. LocalDate)? If so, then please drop support for Java 7 ASAP. :-)
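
A hedged sketch of a workaround available without waiting for Java 7 to be dropped, assuming Spark 2.x on Java 8 and that Kryo's default field serializer can round-trip java.time.LocalDate: an explicit Kryo-backed encoder can be put in scope, although the resulting column is opaque binary rather than a proper date type.

    import java.time.LocalDate
    import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

    object LocalDateWorkaround {
      // Kryo-backed encoder: usable today, but the column is binary, not DateType.
      implicit val localDateEncoder: Encoder[LocalDate] = Encoders.kryo[LocalDate]

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("localdate-encoder")
          .getOrCreate()
        val ds = spark.createDataset(Seq(LocalDate.of(2016, 10, 25)))
        println(ds.count())
        spark.stop()
      }
    }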

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Cody Koeninger
I think only supporting one version of Scala at any given time is not sufficient; two probably is ok. I.e., don't drop 2.10 before 2.12 is out and supported. On Tue, Oct 25, 2016 at 10:56 AM, Sean Owen wrote: > The general forces are that new versions of things to support emerge, and > are valuable to s

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Ofir Manor
I think that 2.1 should include a visible deprecation message about Java 7, Scala 2.10 and older Hadoop versions (plus Python if there is a consensus on that), to give users / admins early warning, followed by dropping them from trunk for 2.2 once 2.1 is released. Personally, we use only Scala 2.11

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Sean Owen
The general forces are that new versions of things to support emerge, and are valuable to support, but have some cost to support in addition to old versions. And the old versions become less used and therefore less valuable to support, and at some point it tips to being more cost than value. It's h

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Holden Karau
I'd also like to add Python 2.6 to the list of things. We've considered dropping it before but never followed through to the best of my knowledge (although on mobile right now so can't double check). On Tuesday, October 25, 2016, Sean Owen wrote: > I'd like to gauge where people stand on the iss

Re: Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Mark Hamstra
What's changed since the last time we discussed these issues, about 7 months ago? Or, another way to formulate the question: What are the threshold criteria that we should use to decide when to end Scala 2.10 and/or Java 7 support? On Tue, Oct 25, 2016 at 8:36 AM, Sean Owen wrote: > I'd like to

Straw poll: dropping support for things like Scala 2.10

2016-10-25 Thread Sean Owen
I'd like to gauge where people stand on the issue of dropping support for a few things that were considered for 2.0. First: Scala 2.10. We've seen a number of build breakages this week because the PR builder only tests 2.11. No big deal at this stage, but it did cause me to wonder whether it's ti

Converting spark types and standard scala types

2016-10-25 Thread assaf.mendelson
Hi, I am trying to write a new aggregate function (https://issues.apache.org/jira/browse/SPARK-17691) and I want it to support all ordered types. I have several issues, though: 1. How to convert the type of the child expression to a Scala standard type (e.g., I need an Array[Int] for Int
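
On point 1, a rough sketch of the conversion involved, assuming the aggregate works against Catalyst's internal row format; the helper name intArrayAt is hypothetical, not part of Spark's API. An array<int> child arrives as ArrayData inside an InternalRow and has to be unwrapped into a plain Scala array before use (CatalystTypeConverters offers a more generic route for arbitrary types):

    import org.apache.spark.sql.catalyst.InternalRow
    import org.apache.spark.sql.catalyst.util.ArrayData

    // Hypothetical helper: extract an Array[Int] from an input row, given the
    // ordinal of an array<int> child expression.
    def intArrayAt(row: InternalRow, ordinal: Int): Array[Int] = {
      val data: ArrayData = row.getArray(ordinal)
      data.toIntArray()
    }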

PSA: watch the apache/spark-website repo if interested

2016-10-25 Thread Sean Owen
I don't believe emails about the spark-website repo are forwarded to the project mailing lists. If you want to watch for them, go star/watch the repo to be sure. I just opened a PR, for example. https://github.com/apache/spark-website