Hi Mirko,

What exactly was the setting? I'd like to reproduce it. Can you file
an issue in JIRA to fix that?

Best regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Jun 24, 2016 at 10:54 AM, Mirko <mirko.bernard...@ixxus.com> wrote:
> Hi,
>
> Many thanks for the suggestions.
> I discovered that the problem was caused by a missing driver definition in
> the JDBC options map.
> The error message wasn't very helpful in figuring that out!
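For reference, a minimal sketch of an options map that does include the "driver" key, as described above. The URL, database, table, and driver class below are made-up placeholders, not values from the original thread:

```java
import java.util.HashMap;
import java.util.Map;

public class JdbcOptionsSketch {
    public static void main(String[] args) {
        // Sketch: Spark's JDBC data source can fail with an unhelpful
        // NoSuchMethodException on DriverWrapper.<init>() when the options
        // map lacks an explicit "driver" entry.
        Map<String, String> jdbcOptions = new HashMap<>();
        jdbcOptions.put("url", "jdbc:postgresql://localhost:5432/mydb"); // assumed URL
        jdbcOptions.put("dbtable", "my_table");                          // assumed table
        jdbcOptions.put("driver", "org.postgresql.Driver");              // the key that was missing

        // usage would then be along the lines of:
        //   sqlContext.read().format("jdbc").options(jdbcOptions).load();
        System.out.println(jdbcOptions.get("driver"));
    }
}
```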
>
> Cheers,
> Mirko
>
> On 22 Jun 2016, at 18:11, markcitizen [via Apache Spark User List] <[hidden
> email]> wrote:
>
> Hello,
> I can't help you with your particular problem, but errors like the one
> you're seeing are usually caused by class version incompatibility.
> I recently spent a lot of time researching a problem similar to yours
> with Spark 1.6.1 (Scala).
> For us the collision was a class version conflict in the
> org.jboss.netty.handler.ssl package.
>
> I think there are three ways to solve this:
> 1) Check jar versions for jars deployed as part of Spark runtime and use the
> same versions in your code
> 2) Update your Spark runtime libs (if you can) with versions of jars that
> work (that can be tricky)
> 3) The solution we used was to add shading configuration (in Build.scala)
> for packages that were colliding:
>
> val shadingRules = Seq(
>   ShadeRule.rename("org.jboss.netty.handler.ssl.**" -> "shadeit.@1").inAll
> )
>
> assemblyShadeRules in assembly := shadingRules
>
> That last option worked best because it allowed us to control version
> collisions for multiple packages/classes.
> I don't know if you're using Scala or Java, but maybe this gives you some
> ideas on how to proceed.
> Fixing a class version collision can be a messy ordeal; you may need to
> shuffle jars/versions around until it works. Maven also has a shading
> plugin, so if you're using Java maybe you can try that:
> https://maven.apache.org/plugins/maven-shade-plugin/
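For the Maven route, a sketch of a relocation equivalent to the sbt-assembly rule above. The `shadeit` prefix mirrors the sbt example; the package pattern is the one from this thread, and the surrounding POM is assumed:

```xml
<!-- Sketch: relocate the colliding package with maven-shade-plugin,
     analogous to the ShadeRule.rename(...) sbt-assembly rule above -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.jboss.netty.handler.ssl</pattern>
            <shadedPattern>shadeit.org.jboss.netty.handler.ssl</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```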
>
> Best,
>
> M
>
>
>
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
