So, I need to connect to multiple databases from a single Spark job. To do
this, I need multiple JDBC drivers on the classpath: Postgres + MySQL.

*Problem*: Spark fails to load both drivers at the same time.

This method works for one driver at a time:

spark-submit **** --driver-class-path="/driver.jar"

These methods do not work for a single driver, let alone both (even though
Spark does log Added "driver.jar" with timestamp ***):

   - spark-submit --jars "driver1.jar, driver2.jar"
   - sparkContext.addJar("driver.jar")
   - echo 'spark.driver.extraClassPath="driver.jar"' >> spark-defaults.conf
   - echo 'spark.executor.extraClassPath="driver.jar"' >> spark-defaults.conf
   - sbt assembly (fat jar with drivers)
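For completeness, the combined invocation I would expect to work looks
roughly like this. The jar paths and app.jar are placeholders, not my real
file names; --driver-class-path takes a JVM classpath (colon-separated on
Linux), while --jars takes a comma-separated list with no spaces:

```shell
# Sketch of a combined submit; all paths below are placeholders.
# --driver-class-path: classpath for the driver JVM (colon-separated)
# --jars: jars shipped to executors (comma-separated, no spaces)
spark-submit \
  --driver-class-path "/path/postgresql.jar:/path/mysql.jar" \
  --jars "/path/postgresql.jar,/path/mysql.jar" \
  app.jar
```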

*Example error:*

Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:mysql://****
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1055)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3491)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3423)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:910)
    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:3923)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1273)

*Versions Tested*: Spark 1.3.1 and 1.4.1

What method can I use to load both drivers?
Thanks,

Nicholas Connor
