Hi Mohammed,

Did you use --jars to specify your jdbc driver when you submitted your job?
Take a look at this link:
http://spark.apache.org/docs/1.2.0/submitting-applications.html
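
For example, something along these lines (the class name, master URL, jar versions, and paths below are placeholders, not your actual ones):

```shell
# Pass the Postgres JDBC driver (and BoneCP) to the driver and executors
# via --jars so they end up on the worker classpath.
spark-submit \
  --class com.example.MyApp \
  --master spark://master:7077 \
  --jars /path/to/postgresql-9.3-1102.jdbc41.jar,/path/to/bonecp-0.8.0.RELEASE.jar \
  myapp.jar
```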

Hope this helps!

Kelvin

On Thu, Feb 19, 2015 at 7:24 PM, Mohammed Guller <moham...@glassbeam.com>
wrote:

>  Hi –
>
> I am trying to use BoneCP (a database connection pooling library) to write
> data from my Spark application to an RDBMS. The database inserts are inside
> a foreachPartition code block. I am getting this exception when the code
> tries to insert data using BoneCP:
>
>
>
> java.sql.SQLException: No suitable driver found for
> jdbc:postgresql://hostname:5432/dbname
>
>
>
> I tried explicitly loading the Postgres driver on the worker nodes by
> adding the following line inside the foreachPartition code block:
>
>
>
> Class.forName("org.postgresql.Driver")
>
>
>
> It didn’t help.
>
>
>
> Has anybody been able to get a database connection pool library to work
> with Spark? If you got it working, can you please share the steps?
>
>
>
> Thanks,
>
> Mohammed
>
>
>