Looks like you need to add a "driver" option to your code, such as:
sqlContext.read.format("jdbc").options(
  Map("url" -> "jdbc:oracle:thin:@:1521:xxx",
      "driver" -> "oracle.jdbc.driver.OracleDriver",
      "dbtable" -> "your_table_name")).load()
Best Regards,
Shixiong Zhu
2015-12-21 6:0
Please make sure this is the correct JDBC URL:
jdbc:oracle:thin:@:1521:xxx
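For reference, here is a minimal self-contained sketch that combines the two suggestions above (the "driver" option plus a complete thin URL). The object name, host/SID, table name, and credentials are placeholders for illustration, not values from this thread:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object JdbcReadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("jdbc-read-sketch"))
    val sqlContext = new SQLContext(sc)

    // "driver" tells Spark which JDBC driver class to load for this URL
    val df = sqlContext.read.format("jdbc").options(Map(
      "url"      -> "jdbc:oracle:thin:@dbhost:1521:orcl", // placeholder host/SID
      "driver"   -> "oracle.jdbc.driver.OracleDriver",
      "dbtable"  -> "your_table_name",                    // placeholder table
      "user"     -> "db_user",                            // placeholder credentials
      "password" -> "db_password"
    )).load()

    df.show()
    sc.stop()
  }
}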
On Mon, Dec 21, 2015 at 9:54 PM, Madabhattula Rajesh Kumar <
mrajaf...@gmail.com> wrote:
> Hi Jeff and Satish,
>
> I have modified script and executed. Please find below command
>
> ./spark-submit --master local --class
Hi Jeff and Satish,
I have modified the script and executed it. Please find the command below:
./spark-submit --master local --class test.Main --jars
/home/user/download/jar/ojdbc7.jar
/home//test/target/spark16-0.0.1-SNAPSHOT.jar
I'm still getting the same exception:
Exception in thread "main" java.sql.SQLE
Hi Rajesh,
Could you please try giving your cmd as mentioned below:
./spark-submit --master local --class --jars
Regards,
Satish Chandra
On Mon, Dec 21, 2015 at 6:45 PM, Madabhattula Rajesh Kumar <
mrajaf...@gmail.com> wrote:
> Hi,
>
> How to add dependent jars in spark-submit command. For
Put /test/target/spark16-0.0.1-SNAPSHOT.jar as the last argument
./spark-submit --master local --class test.Main --jars
/home/user/download/jar/ojdbc7.jar /test/target/spark16-0.0.1-SNAPSHOT.jar
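For context, this matches the general shape of a spark-submit invocation from the Spark docs: all options (including --jars) come before the application jar, and anything placed after the application jar is passed to it as program arguments (angle brackets are placeholders):

./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --jars <comma-separated-dependency-jars> \
  <application-jar> \
  [application-arguments]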
On Mon, Dec 21, 2015 at 9:15 PM, Madabhattula Rajesh Kumar <
mrajaf...@gmail.com> wrote:
> Hi,
>
>