Thanks for sharing! A very interesting read indeed.
Best regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On Fri, Jan 20, 2017 at 10:17 PM, Morten Hornbech wrote:
Sure :-)
Digging into the InvocationTargetException revealed a "NoSuchFieldError:
DEFAULT_MAX_PENDING_TASKS", which we guessed was linked to some kind of binary
incompatibility in the dependencies. Looking into the stack trace, this could be
traced to a dynamic constructor call in netty, and we…
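
A minimal sketch of how one might confirm such a mix-up at runtime, by printing
which jar actually serves the class behind the failing field. The class name
here is an assumption based on where DEFAULT_MAX_PENDING_TASKS lives in netty
4.0.x; substitute whatever class appears in your own stack trace:

    // Prints the jar the JVM loaded the class from; two netty versions on
    // the classpath show up as the "wrong" jar winning here.
    object WhichNetty {
      def main(args: Array[String]): Unit = {
        val cls = Class.forName("io.netty.util.concurrent.SingleThreadEventExecutor")
        // A relative resource name resolves inside the class's own package.
        println(cls.getResource("SingleThreadEventExecutor.class"))
      }
    }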
Hi,
I'd be very interested in how you figured it out. Mind sharing?
Jacek
On 18 Jan 2017 9:51 p.m., "mhornbech" wrote:
For anyone revisiting this at a later point, the issue was that Spark 2.1.0
upgrades netty to version 4.0.42, which is not binary compatible with version
4.0.37 used by version 3.1.0 of the Cassandra Java Driver. The newer version
can work with Cassandra, but because of differences in the maven artifacts
(Spark depends on netty-all, while the driver depends on the individual netty
modules), the build tool does not treat them as conflicting versions of the
same module, so both can end up on the classpath.
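
A hedged sketch of one way to line the versions up in sbt, assuming the build
depends on cassandra-driver-core 3.1.0 directly. The coordinates and module
list below are illustrative, not the thread's verbatim fix; check your own
dependency report before copying them:

    // build.sbt -- force the driver's individual netty modules up to the
    // version Spark 2.1.0 ships, so only one binary version of each netty
    // class is on the classpath.
    libraryDependencies ++= Seq(
      "org.apache.spark"       %% "spark-sql"             % "2.1.0",
      "com.datastax.cassandra"  % "cassandra-driver-core" % "3.1.0"
    )

    // sbt 0.13.x: dependencyOverrides is a Set[ModuleID]
    dependencyOverrides ++= Set(
      "io.netty" % "netty-handler"   % "4.0.42.Final",
      "io.netty" % "netty-transport" % "4.0.42.Final",
      "io.netty" % "netty-buffer"    % "4.0.42.Final",
      "io.netty" % "netty-codec"     % "4.0.42.Final",
      "io.netty" % "netty-common"    % "4.0.42.Final"
    )

An alternative is to exclude the netty modules from cassandra-driver-core and
let the driver run against the classes in Spark's netty-all 4.0.42, which, as
noted above, works with Cassandra.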
Hi
We just tested a switch from Spark 2.0.2 to Spark 2.1.0 on our codebase. It
compiles fine, but introduces the following runtime exception upon
initialization of our Cassandra database. I can't find any clues in the
release notes. Has anyone experienced this?
Morten
sbt.ForkMain$ForkError: java.lang.reflect.InvocationTargetException …