Hi –
I am trying to use BoneCP (a database connection pooling library) to write data 
from my Spark application to an RDBMS. The database inserts are inside a 
foreachPartition code block. I am getting this exception when the code tries to 
insert data using BoneCP:

java.sql.SQLException: No suitable driver found for 
jdbc:postgresql://hostname:5432/dbname

I tried explicitly loading the Postgres driver on the worker nodes by adding 
the following line inside the foreachPartition code block:

Class.forName("org.postgresql.Driver")

It didn’t help.
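For context, here is a minimal sketch of what I'm doing inside foreachPartition (the URL, credentials, and table are placeholders, and `rdd` stands for whatever RDD I'm writing out):

```scala
import java.sql.Connection
import com.jolbox.bonecp.{BoneCP, BoneCPConfig}

rdd.foreachPartition { rows =>
  // My attempt to force driver registration on the worker node:
  Class.forName("org.postgresql.Driver")

  val config = new BoneCPConfig()
  config.setJdbcUrl("jdbc:postgresql://hostname:5432/dbname") // placeholder URL
  config.setUsername("user")                                  // placeholder
  config.setPassword("password")                              // placeholder

  // Creating the pool is where the "No suitable driver found"
  // SQLException is thrown
  val pool = new BoneCP(config)
  val conn: Connection = pool.getConnection

  rows.foreach { row =>
    // INSERT each row using conn ...
  }

  conn.close()
  pool.shutdown()
}
```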

Has anybody been able to get a database connection pool library to work with 
Spark? If you got it working, can you please share the steps?

Thanks,
Mohammed
