Hi,

Please see the example code in
https://github.com/gaborgsomogyi/spark-jdbc-connection-provider
(and the corresponding PR: https://github.com/apache/spark/pull/29024).
Since it depends on the Java service loader, I think you need to add a
provider configuration file under META-INF/services.
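
A rough sketch of what I mean (the package and class names below are just
placeholders, and the exact set of methods to override may differ by Spark
version, so please check the example repo above):

    // my/MyConnectionProvider.scala (hypothetical names, for illustration only)
    package my

    import java.sql.{Connection, Driver}
    import java.util.Properties

    import org.apache.spark.sql.jdbc.JdbcConnectionProvider

    class MyConnectionProvider extends JdbcConnectionProvider {
      // Return true when this provider should handle the given driver/options.
      override def canHandle(driver: Driver, options: Map[String, String]): Boolean =
        driver.getClass.getName == "my.driver"

      // Open the JDBC connection; any custom authentication logic goes here.
      // This simplified version assumes the "url" option is always present.
      override def getConnection(driver: Driver, options: Map[String, String]): Connection =
        driver.connect(options("url"), new Properties())
    }

Then register it for the service loader by putting the fully qualified class
name into a resource file on the classpath:

    # src/main/resources/META-INF/services/org.apache.spark.sql.jdbc.JdbcConnectionProvider
    my.MyConnectionProvider

Without that file, Spark's ServiceLoader lookup cannot discover your provider,
which is probably why your implementation is not being used.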

Bests,
Takeshi

On Tue, Oct 27, 2020 at 9:50 PM rafaelkyrdan <rafaelkyr...@gmail.com> wrote:

> Guys, do you know how I can use a custom implementation of
> JdbcConnectionProvider?
>
> As far as I understand, in the Spark JDBC data source we can use a custom
> Driver, like this:
>
> val jdbcDF = spark.read
>   .format("jdbc")
>   .option("url", "jdbc:postgresql:dbserver")
>   .option("driver", "my.driver")
>   .load()
> And we need a matching JdbcConnectionProvider which overrides the
> property:
>
> override val driverClass = "my.driver"
>
> I have both, but I see that they are not used. Do I need to register them
> somehow? Could someone share a relevant example?
> Thx.

-- 
---
Takeshi Yamamuro
