Usually this sort of thing happens because the two major versions live in different namespaces (3.x under org.jboss.netty, 4.x under io.netty), so both can coexist on the classpath and both are needed. That is true of Netty: http://netty.io/wiki/new-and-noteworthy-in-4.0.html However, I see that Spark declares a direct dependency on both, even though it does not use 3.x directly (and should not). The exception is the Flume module, but that could be handled more narrowly. I will look into fixing this if applicable.
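One narrower way to handle it, sketched below as a hypothetical pom.xml fragment (the flume-ng-sdk coordinates and ${flume.version} property are assumptions for illustration, not Spark's actual build): drop the top-level 3.x dependency and instead let only the Flume module pull it in, or exclude the transitive 3.x artifact where it is not wanted:

```xml
<!-- Hypothetical pom.xml fragment: keep io.netty:netty-all (4.x) as the
     only direct Netty dependency, and exclude the transitive 3.x artifact
     (io.netty:netty) from a dependency that still brings it in. -->
<dependency>
  <groupId>org.apache.flume</groupId>
  <artifactId>flume-ng-sdk</artifactId>
  <version>${flume.version}</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Running `mvn dependency:tree -Dincludes=io.netty` would show which modules still resolve the 3.x artifact after such a change.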
On Mon, Oct 10, 2016 at 11:56 AM Paweł Szulc <paul.sz...@gmail.com> wrote:
> Hi,
>
> quick question, why is Spark using two different versions of netty?:
>
> - io.netty:netty-all:4.0.29.Final:jar
> - io.netty:netty:3.8.0.Final:jar
>
> ?
>
> --
> Regards,
> Paul Szulc
>
> twitter: @rabbitonweb
> blog: www.rabbitonweb.com