Could you try 2.2? We fixed multiple Oracle-related issues in the latest
release.
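
For context, JDBC type -101 is Oracle's TIMESTAMP WITH TIME ZONE
(oracle.jdbc.OracleTypes.TIMESTAMPTZ), which the Spark 2.0 JDBC reader does
not map to a Catalyst type. If you cannot upgrade right away, one workaround
is to cast the offending column inside the pushed-down subquery so the driver
reports a type Spark understands. A minimal sketch, assuming hypothetical
column names (TS_COL for the TIMESTAMP WITH TIME ZONE column, JSON_COL for
the JSON payload):

// Hypothetical column names; the cast drops the time zone offset.
// Use TO_CHAR(TS_COL) instead if you need to preserve it as a string.
final String dbTable =
        "(select CAST(TS_COL AS TIMESTAMP) AS TS_COL, JSON_COL from MySampleTable)";

Dataset<Row> jdbcDF = spark.read().jdbc(URL, dbTable, connectionProperties);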

Thanks

Xiao


On Wed, 19 Jul 2017 at 11:10 PM Cassa L <lcas...@gmail.com> wrote:

> Hi,
> I am trying to use Spark 2.0 to read from an Oracle (12.1) table.
> My table has JSON data. I am getting the exception below in my code. Any clue?
>
> >>>>>
> java.sql.SQLException: Unsupported type -101
>
> at
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$getCatalystType(JdbcUtils.scala:233)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:290)
> at
> org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:290)
> at scala.Option.getOrElse(Option.scala:121)
> at ...
>
> ==========
> My code is very simple.
>
> SparkSession spark = SparkSession
>         .builder()
>         .appName("Oracle Example")
>         .master("local[4]")
>         .getOrCreate();
>
> final Properties connectionProperties = new Properties();
> connectionProperties.put("user", *"some_user"*));
> connectionProperties.put("password", "some_pwd"));
>
> final String dbTable =
>         "(select *  from  MySampleTable)";
>
> Dataset<Row> jdbcDF = spark.read().jdbc(URL, dbTable, connectionProperties);
>
>
