If memory serves me correctly, in 1.3.1 at least there was a problem with
when the driver was added -- the right classloader wasn't picking it up.
You can try searching the archives, but the issue is similar to these
threads:
http://stackoverflow.com/questions/30940566/connecting-from-spark-pyspark-
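If I remember right, the usual workaround was to get the driver jar onto the
classpath before the driver and executor JVMs start, rather than adding it
afterwards -- for example via spark-defaults.conf (the paths below are just
placeholders):

    spark.driver.extraClassPath    /path/to/postgresql-jdbc.jar
    spark.executor.extraClassPath  /path/to/postgresql-jdbc.jar

or the equivalent --driver-class-path / --conf flags on spark-submit.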
So, I need to connect to multiple databases to do cool stuff with Spark. To
do this, I need multiple database drivers: Postgres + MySQL.
*Problem*: Spark fails to load both drivers at the same time.
This method works for one driver at a time:
spark-submit --driver-class-path="/driver.jar"
These methods do not work:
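Passing both jars in a single submit usually looks something like this
(my_app.py and the jar paths are placeholders; on Linux the
--driver-class-path entries are colon-separated and the --jars entries
comma-separated):

    spark-submit \
      --driver-class-path "/path/to/postgresql-jdbc.jar:/path/to/mysql-connector.jar" \
      --jars /path/to/postgresql-jdbc.jar,/path/to/mysql-connector.jar \
      my_app.py

If the classloader issue above is the cause, the same jars may also need to go
on spark.executor.extraClassPath (or into spark-defaults.conf as above) so they
are visible when the JVMs start.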