Thanks for sharing! A very interesting read indeed.

Best regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Jan 20, 2017 at 10:17 PM, Morten Hornbech <mor...@datasolvr.com> wrote:
> Sure :-)
>
> Digging into the InvocationTargetException revealed a "NoSuchFieldError:
> DEFAULT_MAX_PENDING_TASKS", which we guessed was caused by some kind of
> binary incompatibility in the dependencies. The stack trace led to a
> dynamic constructor call in Netty, and we could see that Spark's
> netty-all dependency had been bumped from 4.0.29 to 4.0.42. At the same
> time we have a dependency on netty-transport 4.0.37 due to our use of
> the Cassandra Java Driver, and those classes are also present in
> netty-all. Comparing the versions we confirmed that they were indeed
> not binary compatible: a field had been removed, just as the original
> error said. Since the artifacts are different they can co-exist on the
> classpath, but their classes cannot co-exist at runtime, and the error
> appears whenever the incompatible version happens to be loaded.
> Interestingly, we could not reproduce this on our laptops, only on our
> build server, so there must be some nondeterministic difference in
> classloader behaviour.
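>
> As a quick sanity check for this kind of clash, you can print which jar
> a class was actually loaded from. A minimal sketch in Scala (the Netty
> class named below is only an example; substitute whatever class your
> own stack trace points at):
>
>     // Prints the jar a class was loaded from, which makes classpath
>     // conflicts between netty-all and netty-transport visible.
>     object WhichJar {
>       def main(args: Array[String]): Unit = {
>         val cls = Class.forName("io.netty.channel.SingleThreadEventLoop")
>         // getCodeSource is non-null for classes loaded from a jar
>         val location = cls.getProtectionDomain.getCodeSource.getLocation
>         println(s"${cls.getName} loaded from $location")
>       }
>     }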
>
> Dependency issues are the typical problem we hit when taking in a new
> version of Spark, and they always take some time to track down. We had
> a lot of issues with Guava when taking in 2.0 and ended up shading it
> when no other option was available. It would be really nice if such
> version bumps were included in the release notes.
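>
> For reference, shading like that can be done with sbt-assembly's shade
> rules. A minimal sketch, assuming the sbt-assembly plugin is enabled
> (the target package "myapp.shaded.guava" is just a placeholder):
>
>     // build.sbt: rewrite Guava's packages into our own namespace so
>     // our copy cannot clash with the Guava version Spark ships.
>     assemblyShadeRules in assembly := Seq(
>       ShadeRule.rename("com.google.common.**" -> "myapp.shaded.guava.@1").inAll
>     )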
>
> Morten
>
>
> On 20 Jan 2017, at 20:21, Jacek Laskowski <ja...@japila.pl> wrote:
>
> Hi,
>
> I'd be very interested in how you figured it out. Mind sharing?
>
> Jacek
>
> On 18 Jan 2017 9:51 p.m., "mhornbech" <mor...@datasolvr.com> wrote:
>>
>> For anyone revisiting this at a later point, the issue was that Spark
>> 2.1.0 upgrades netty to version 4.0.42, which is not binary compatible
>> with version 4.0.37 used by version 3.1.0 of the Cassandra Java Driver.
>> The newer version can work with Cassandra, but because the Maven
>> artifacts differ (Spark depends on netty-all while the Cassandra driver
>> depends on netty-transport), the conflict was not automatically
>> resolved by SBT. Adding an explicit dependency on netty-transport
>> version 4.0.42 solved the problem.
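>>
>> In build.sbt that amounts to a one-liner (a sketch; note that Netty
>> artifacts on Maven Central carry a ".Final" suffix, so the full
>> version string is "4.0.42.Final"):
>>
>>     // Pin netty-transport to the same version as Spark's netty-all so
>>     // only one binary version of the transport classes is resolved.
>>     libraryDependencies += "io.netty" % "netty-transport" % "4.0.42.Final"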
>>
>>
>>
>
