Can you create a pull request? It is difficult to know what's going on.
On Mon, Feb 8, 2016 at 4:51 PM, sim wrote:
> 24 test failures for sql/test:
> https://gist.github.com/ssimeonov/89862967f87c5c497322
24 test failures for sql/test:
https://gist.github.com/ssimeonov/89862967f87c5c497322
Yeah, I'm not sure what's going on either. You can just run the unit tests
through "build/sbt sql/test" without running MiMa.
On Mon, Feb 8, 2016 at 3:47 PM, sim wrote:
> Same result with both caches cleared.
Same result with both caches cleared.
Not 100% sure what's going on, but you can try wiping your local ivy2 and
Maven caches.
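Something along these lines should do it, assuming the default cache locations under your home directory:

  # default locations; adjust if you've configured ivy/maven differently
  rm -rf ~/.ivy2/cache
  rm -rf ~/.m2/repository

  # or, if you'd rather not re-download everything, just the spark artifacts:
  rm -rf ~/.ivy2/cache/org.apache.spark
  rm -rf ~/.m2/repository/org/apache/spark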
On Mon, Feb 8, 2016 at 12:05 PM, sim wrote:
> Reynold, I just forked + built master and I'm getting lots of binary
> compatibility errors when running the tests.
Reynold, I just forked + built master and I'm getting lots of binary
compatibility errors when running the tests.
https://gist.github.com/ssimeonov/69cb0b41750be776
There's nothing about this in the dev tools section of the wiki. Any advice on
how to get a green build before I work on the PRs?
Thanks,
Sim
Sure.
Both of these make sense to add. Can you submit a pull request?
On Sun, Feb 7, 2016 at 3:29 PM, sim wrote:
> The more Spark code I write, the more I hit the same use cases where the
> Scala APIs feel a bit awkward. I'd love to understand if there are
> historical reasons for these and whether t