Re: Oracle JDBC - Spark SQL - Key Not Found: Scale

2017-01-26 Thread ayan guha
Hi, I will do a little more testing and will let you know. It did not work with INT and NUMBER types, for sure. Writing, on the other hand, works fine :) On Fri, Jan 27, 2017 at 1:04 PM, Takeshi Yamamuro wrote: > How about this? > https://github.com/apache/spark/blob/master/sql/core/ > src/test/scal

Re: Oracle JDBC - Spark SQL - Key Not Found: Scale

2017-01-26 Thread Takeshi Yamamuro
How about this? https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCSuite.scala#L729 Or, how about using Double or something instead of Numeric? // maropu On Fri, Jan 27, 2017 at 10:25 AM, ayan guha wrote: > Okay, it is working with varchar columns
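Acting on the "use Double or something instead of Numeric" idea on the Spark side, a rough sketch (my illustration, not code from the thread) is to register a custom JdbcDialect that maps Oracle NUMERIC columns straight to DoubleType, so the read path never has to look up the scale metadata. Note this uses the Scala/Java API, so from the pyspark shell it would have to be compiled into a jar and put on the driver classpath.

import java.sql.Types
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types.{DataType, DoubleType, MetadataBuilder}

// Custom dialect sketch: claim Oracle JDBC URLs and map every NUMBER/NUMERIC
// column to DoubleType instead of a DecimalType that depends on scale metadata.
object OracleNumberAsDoubleDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:oracle")

  override def getCatalystType(
      sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType] = {
    if (sqlType == Types.NUMERIC) Some(DoubleType) else None  // None = fall back to the built-in mapping
  }
}

JdbcDialects.registerDialect(OracleNumberAsDoubleDialect)

The obvious trade-off is precision: a NUMBER(38) does not fit losslessly into a double, so this is only reasonable where approximate values are acceptable.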

Re: Oracle JDBC - Spark SQL - Key Not Found: Scale

2017-01-26 Thread ayan guha
Okay, it is working with varchar columns only. Is there any way to work around this? On Fri, Jan 27, 2017 at 12:22 PM, ayan guha wrote: > hi > > I thought so too, so I created a table with INT and Varchar columns > > desc agtest1 > > Name Null Type > - > PID NUMBER(38)
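Since VARCHAR2 columns read fine, one possible workaround (my suggestion, not something proposed in the thread) is to push a character conversion into the dbtable option so Oracle hands the NUMBER column over as text, then cast it back in Spark. Table, column, and connection details are the ones quoted in the message below; the original poster was on the pyspark shell, this sketch uses the Scala shell's sqlContext, and the full driver class name is an assumption (it is truncated in the thread).

val df = sqlContext.read.format("jdbc").options(Map(
  "url"      -> "jdbc:oracle:thin:@mpimpclu1-scan:1521/DEVAIM",
  "dbtable"  -> "(SELECT TO_CHAR(PID) AS PID, DES FROM agtest1) t",  // NUMBER column arrives as a string
  "user"     -> "bal",
  "password" -> "bal",
  "driver"   -> "oracle.jdbc.OracleDriver"  // assumed full class name; truncated in the original message
)).load()

val fixed = df.withColumn("PID", df("PID").cast("long"))  // restore a numeric type on the Spark side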

Re: Oracle JDBC - Spark SQL - Key Not Found: Scale

2017-01-26 Thread ayan guha
hi

I thought so too, so I created a table with INT and Varchar columns.

desc agtest1

Name  Null  Type
----  ----  -------------
PID         NUMBER(38)
DES         VARCHAR2(100)

url = "jdbc:oracle:thin:@mpimpclu1-scan:1521/DEVAIM"
table = "agtest1"
user = "bal"
password = "bal"
driver = "oracle.jdbc.OracleDri
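For context, here is the failing read reconstructed from the settings above (a Scala sketch of what the pyspark session was doing; the driver class name is assumed, as it is truncated here). One detail worth noting: Oracle treats INT as an alias for NUMBER(38), which is why the column created as INT shows up as NUMBER(38) and still goes through the JDBC NUMERIC code path.

val df = sqlContext.read.format("jdbc").options(Map(
  "url"      -> "jdbc:oracle:thin:@mpimpclu1-scan:1521/DEVAIM",
  "dbtable"  -> "agtest1",
  "user"     -> "bal",
  "password" -> "bal",
  "driver"   -> "oracle.jdbc.OracleDriver"
)).load()

// Reading agtest1, with its NUMBER(38) column, is where the
// "key not found: scale" error was reported; writing works fine.
df.printSchema()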

Re: Oracle JDBC - Spark SQL - Key Not Found: Scale

2017-01-26 Thread Takeshi Yamamuro
Hi, I think you got this error because you used `NUMERIC` types in your schema ( https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/jdbc/OracleDialect.scala#L32). So, IIUC avoiding the type is a workaround. // maropu On Fri, Jan 27, 2017 at 8:18 AM, ayan gu

Oracle JDBC - Spark SQL - Key Not Found: Scale

2017-01-26 Thread ayan guha
Hi, I am facing the exact issue with Oracle/Exadata as mentioned here. Any idea? I could not figure it out, so I am sending it to this group hoping someone has seen it (and solved it).

Spark Version: 1.6
pyspark command: pyspark --driver-cla