Although I don't know if it's related, loading drivers via Class.forName() is
a very old pattern. You should be using a DataSource from javax.sql; that has
been the usual practice since about Java 1.4.
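A minimal sketch of what I mean (the table name and column here are made up;
the concrete DataSource would be something like
org.postgresql.ds.PGSimpleDataSource, or a pooling one such as
BoneCPDataSource, configured once at startup):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import javax.sql.DataSource;

// The javax.sql.DataSource interface ships with the JDK; only the concrete
// implementation comes from the driver/pool jar. No Class.forName needed --
// the DataSource knows how to create its own connections.
public class InsertWithDataSource {

    // Insert one row using a connection borrowed from the DataSource.
    // (Table and column names are hypothetical, for illustration only.)
    static void insertName(DataSource ds, String name) throws SQLException {
        try (Connection conn = ds.getConnection();
             PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO names (name) VALUES (?)")) {
            ps.setString(1, name);
            ps.executeUpdate();
        } // connection is returned to the pool (or closed) automatically
    }
}
```

Code written against the interface like this doesn't care whether the
connections are pooled or not, which is exactly why pooling libraries expose
their pools as DataSource implementations.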

Why do you say a different driver is being loaded? That's not what the error here says.

Try instantiating the driver directly to test whether it's available on the
classpath. Otherwise you would have to check that the jar exists, that the
class exists in it, and that the jar is really on your classpath.
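Something like the following, run inside the same code path that fails (e.g.
inside your foreachPartition block), would show what that JVM's classloader
can actually see; the class name here is just your Postgres driver:

```java
import java.sql.Driver;
import java.sql.DriverManager;
import java.util.Enumeration;

public class DriverCheck {

    // True if the named class is loadable from the current classpath.
    static boolean classOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("org.postgresql.Driver on classpath: "
                + classOnClasspath("org.postgresql.Driver"));

        // Drivers actually registered with DriverManager in this JVM --
        // "No suitable driver" means the JDBC URL matched none of these.
        Enumeration<Driver> drivers = DriverManager.getDrivers();
        while (drivers.hasMoreElements()) {
            System.out.println("registered: "
                    + drivers.nextElement().getClass().getName());
        }
    }
}
```

If the class loads on the driver node but not inside foreachPartition, the
jar isn't reaching the executors, which narrows the problem to how the job is
packaged or submitted.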

On Fri, Feb 20, 2015 at 5:27 AM, Mohammed Guller <moham...@glassbeam.com> wrote:
> Hi Kelvin,
>
>
>
> Yes. I am creating an uber jar with the Postgres driver included, but
> nevertheless tried both --jars and --driver-class-path flags. It didn’t help.
>
>
>
> Interestingly, I can’t use BoneCP even in the driver program when I run my
> application with spark-submit. I am getting the same exception when the
> application initializes BoneCP before creating SparkContext. It looks like
> Spark is loading a different version of the Postgres JDBC driver than the
> one that I am linking.
>
>
>
> Mohammed
>
>
>
> From: Kelvin Chu [mailto:2dot7kel...@gmail.com]
> Sent: Thursday, February 19, 2015 7:56 PM
> To: Mohammed Guller
> Cc: user@spark.apache.org
> Subject: Re: using a database connection pool to write data into an RDBMS
> from a Spark application
>
>
>
> Hi Mohammed,
>
>
>
> Did you use --jars to specify your jdbc driver when you submitted your job?
> Take a look at this link:
> http://spark.apache.org/docs/1.2.0/submitting-applications.html
>
>
>
> Hope this helps!
>
>
>
> Kelvin
>
>
>
> On Thu, Feb 19, 2015 at 7:24 PM, Mohammed Guller <moham...@glassbeam.com>
> wrote:
>
> Hi –
>
> I am trying to use BoneCP (a database connection pooling library) to write
> data from my Spark application to an RDBMS. The database inserts are inside
> a foreachPartition code block. I am getting this exception when the code
> tries to insert data using BoneCP:
>
>
>
> java.sql.SQLException: No suitable driver found for
> jdbc:postgresql://hostname:5432/dbname
>
>
>
> I tried explicitly loading the Postgres driver on the worker nodes by adding
> the following line inside the foreachPartition code block:
>
>
>
> Class.forName("org.postgresql.Driver")
>
>
>
> It didn’t help.
>
>
>
> Has anybody been able to get a database connection pool library to work with
> Spark? If you got it working, can you please share the steps?
>
>
>
> Thanks,
>
> Mohammed
>
>
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
