Hi,

Many thanks for the suggestions.
I discovered that the problem was related to a missing driver definition in the 
JDBC options map.
The error message wasn't very helpful in figuring that out!
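For anyone else who hits this, here is roughly what the working read looks like 
once the driver option is set (Spark 1.6 Scala API; the URL, table name, and 
PostgreSQL driver class below are just placeholders for my setup):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setAppName("jdbc-example"))
val sqlContext = new SQLContext(sc)

val jdbcDF = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://dbhost:5432/mydb")  // placeholder URL
  .option("dbtable", "my_table")                        // placeholder table
  // The missing piece in my case: explicitly naming the JDBC driver class.
  .option("driver", "org.postgresql.Driver")
  .load()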

Cheers,
Mirko

On 22 Jun 2016, at 18:11, markcitizen [via Apache Spark User List] 
<ml-node+s1001560n27211...@n3.nabble.com> wrote:

Hello,
I can't help you with your particular problem, but errors like the one you're 
seeing are usually caused by class version incompatibility.
I recently spent a lot of time researching a problem similar to yours with 
Spark 1.6.1 (Scala).
For us the collision was related to a class version in the 
org.jboss.netty.handler.ssl package.

I think there are three ways to solve this:
1) Check the versions of the jars deployed as part of the Spark runtime and use 
the same versions in your code
2) Update your Spark runtime libs (if you can) with jar versions that work 
(that can be tricky)
3) Add shading configuration (in Build.scala) for the colliding packages, which 
is the solution we used:

// In project/Build.scala, using the sbt-assembly plugin:
// rename the colliding netty ssl classes into a private "shadeit" namespace.
val shadingRules = Seq(
  ShadeRule.rename("org.jboss.netty.handler.ssl.**" -> "shadeit.@1").inAll
)

assemblyShadeRules in assembly := shadingRules

That last option worked best for us because it allowed us to control version 
collisions for multiple packages/classes.
I don't know if you're using Scala or Java, but maybe this gives you some ideas 
on how to proceed.
Fixing class version collisions can be a messy ordeal; you'll need to move 
jars/versions around until it works. It looks like Maven also has a shading 
plugin, so if you're using Java maybe you can try that:
https://maven.apache.org/plugins/maven-shade-plugin/

Best,

M

