I think you can pick up your custom-built driver from the command line itself.

Here I am using a custom-built third-party driver to access an on-premises
Oracle table from the cloud:

import scala.util.{Try, Success, Failure}

val jdbcUrl =
  s"jdbc:datadirect:ddhybrid://$HybridServer:$HybridPort;" +
  s"hybridDataPipelineDataSource=$hybridDataPipelineDataSource;" +
  s"datasourceUserId=$OracleUserName;datasourcePassword=$OracleUserPassword;" +
  "encryptionMethod=noEncryption;"

// Read the on-prem Oracle table
val OracleDF = Try(
  spark.read.
    format("jdbc").
    option("url", jdbcUrl).
    option("dbtable", s"$OracleSchema.$OracleTable").
    option("user", HybridServerUserName).
    option("password", HybridServerPassword).
    load()
) match {
  case Success(df) => df
  case Failure(e) =>
    println(e)
    sys.exit(1)
}

if (OracleDF.take(1).isEmpty) {
  println(s"\nSource table $OracleSchema.$OracleTable is empty, exiting")
  sys.exit(1)
}

However, when launching spark-shell (or spark-submit) I explicitly specify the
driver class path:

 spark-shell --driver-class-path /home/hduser/jars/ddhybrid.jar --jars ..
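
On the JdbcConnectionProvider part of your question: as far as I know, Spark
discovers providers through Java's ServiceLoader mechanism, so having the
class on the classpath is not enough by itself. A rough sketch of what a
provider could look like (the class name is illustrative, and the exact
abstract members differ between Spark versions, so check the
JdbcConnectionProvider source for the version you run):

```scala
import java.sql.{Connection, Driver}
import org.apache.spark.sql.jdbc.JdbcConnectionProvider

// Hypothetical provider matching a driver passed as option("driver", "my.driver")
class MyConnectionProvider extends JdbcConnectionProvider {
  override val name: String = "myProvider"

  // Only claim JDBC options that name our driver
  override def canHandle(driver: Driver, options: Map[String, String]): Boolean =
    options.get("driver").contains("my.driver")

  override def getConnection(driver: Driver, options: Map[String, String]): Connection = {
    // Forward all JDBC options to the driver as connection properties
    val props = new java.util.Properties()
    options.foreach { case (k, v) => props.setProperty(k, v) }
    driver.connect(options("url"), props)
  }
}
```

Then, inside the same jar, you would also need a service registration file
META-INF/services/org.apache.spark.sql.jdbc.JdbcConnectionProvider whose
single line is the fully qualified class name of the provider, and pass the
jar with --jars. Without that file the provider is never loaded, which would
match the "not used" symptom you describe.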

Assuming I understood your question correctly.

HTH



On Tue, 27 Oct 2020 at 12:51, rafaelkyrdan <rafaelkyr...@gmail.com> wrote:

> Guys do you know how I can use the custom implementation of
> JdbcConnectionProvider?
>
> As far as I understand in the spark jdbc we can use custom Driver, like
> this:
> val jdbcDF = spark.read
>   .format("jdbc")
>   .option("url", "jdbc:postgresql:dbserver")
>   .option("driver", "my.driver")
> And we need a matching JdbcConnectionProvider which will override the
> property:
> override val driverClass = "my.driver"
>
> I have both, but I see that they are not used. Do I need to register them
> somehow? Could someone share a relevant example?
> Thx.
>
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
