It's probably OK, IMHO. The overhead of another dialect is small. Are there differences that require a new dialect? I assume so; it might be useful to summarize them if you open a PR.
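
For reference, a dialect is not much code. Roughly, it's a sketch like the one below (my sketch, not the changeset in the linked commit; the jdbc:vertica URL prefix and the VARCHAR(65000) mapping are assumptions about what a Vertica dialect would want):

import java.sql.Types

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
import org.apache.spark.sql.types._

// Sketch only: claim Vertica JDBC URLs and override the type mappings that
// the generic dialect reportedly gets wrong for Vertica.
case object VerticaDialect extends JdbcDialect {

  override def canHandle(url: String): Boolean =
    url.toLowerCase.startsWith("jdbc:vertica")

  // e.g. map Spark StringType to VARCHAR instead of the generic mapping,
  // so writes and setNull() use a java.sql.Types value Vertica accepts.
  override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
    case StringType  => Some(JdbcType("VARCHAR(65000)", Types.VARCHAR))
    case BooleanType => Some(JdbcType("BOOLEAN", Types.BOOLEAN))
    case _           => None  // fall through to the generic mapping
  }
}

// Built-in dialects are registered in JdbcDialects; a third-party one can be
// registered at runtime the same way.
JdbcDialects.registerDialect(VerticaDialect)

The upstream dialects (Teradata, DB2, etc.) follow the same pattern, so a PR that adds one plus a summary of the Vertica-specific mappings should be easy to review.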
On Tue, Dec 10, 2019 at 7:14 AM Bryan Herger <bryan.her...@microfocus.com> wrote:
>
> Hi, I am a Vertica support engineer, and we have open support requests around
> NULL values and SQL type conversion with DataFrame read/write over JDBC when
> connecting to a Vertica database. The stack traces point to issues with the
> generic JDBCDialect in Spark SQL.
>
> I saw that other vendors (Teradata, DB2, ...) have contributed a JDBCDialect
> class to address JDBC compatibility, so I wrote up a dialect for Vertica.
>
> The changeset is on my fork of apache/spark at
> https://github.com/bryanherger/spark/commit/84d3014e4ead18146147cf299e8996c5c56b377d
>
> I have tested this against Vertica 9.3 and found that this changeset
> addresses both issues reported to us (the issue with NULL values - setNull() -
> for valid java.sql.Types, and String to VARCHAR conversion).
>
> Is this an acceptable change? If so, how should I go about submitting a pull
> request?
>
> Thanks, Bryan Herger
> Vertica Solution Engineer
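
(For anyone following along, the code path that exercises the dialect is just a plain DataFrame JDBC write. A hypothetical repro along these lines - host, database, table, and driver settings are placeholders - is where the reported NULL / String-to-VARCHAR errors would surface with only the generic dialect:

import java.util.Properties

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("vertica-jdbc-write").getOrCreate()
import spark.implicits._

// Nullable columns: the None row is what ends up going through setNull().
val df = Seq(("a", Some(1)), ("b", None)).toDF("name", "value")

val props = new Properties()
props.setProperty("user", "dbadmin")
props.setProperty("driver", "com.vertica.jdbc.Driver")

// Plain JDBC write; the dialect chosen via canHandle() decides the column
// types and the java.sql.Types codes used for nulls.
df.write.jdbc("jdbc:vertica://vertica-host:5433/testdb", "public.spark_test", props)

Something like this in the PR description would make it easy to see which mappings the new dialect changes.)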