Hi Norm,

Here is the documentation for the JDBC connector; you can find the supported
properties there:
https://ci.apache.org/projects/flink/flink-docs-master/dev/table/connect.html#jdbc-connector

Regarding your exception: you don't need to call `inAppendMode`. The JDBC
sink supports both append mode and upsert mode.
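
For example, here is a rough, untested sketch of your descriptor and the
connect() call without `inAppendMode`, using the property keys documented
above (note that `connector.table` is also required; adjust the table name to
your actual Teradata table):

// same imports as your existing Teradata class
public class Teradata extends ConnectorDescriptor {
    public Teradata() {
        super("jdbc", 1, false);
    }

    @Override
    protected Map<String, String> toConnectorProperties() {
        Map<String, String> map = new HashMap<>();
        // property keys as listed in the JDBC connector documentation
        map.put("connector.driver", "com.teradata.jdbc.TeraDriver");
        map.put("connector.url", "jdbc:teradata://myserver/database=mydb,LOGMECH=LDAP");
        map.put("connector.table", "staging"); // your target table in Teradata (adjust)
        map.put("connector.username", "...");
        map.put("connector.password", "...");
        return map;
    }
}

tableEnv.connect(new Teradata())
        .withSchema(new Schema()
                .field("f0", DataTypes.VARCHAR(25))
                .field("f1", DataTypes.VARCHAR(10240))
                .field("f2", DataTypes.VARCHAR(10240)))
        // no .inAppendMode() here
        .createTemporaryTable("staging");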

Besides that, flink-jdbc currently doesn't support a Teradata dialect, and
dialects are not pluggable yet, so I guess you will have to adjust some code in
`JDBCDialects`.
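
A minimal dialect could look roughly like the sketch below (untested, and the
exact `JDBCDialect` methods may differ in your flink-jdbc version); you would
also need to add it to the list of dialects that `JDBCDialects` searches:

// inside flink-jdbc, next to the existing Derby/MySQL/Postgres dialects
private static class TeradataDialect implements JDBCDialect {

    @Override
    public boolean canHandle(String url) {
        // picked up for jdbc:teradata:// URLs
        return url.startsWith("jdbc:teradata:");
    }

    @Override
    public Optional<String> defaultDriverName() {
        return Optional.of("com.teradata.jdbc.TeraDriver");
    }

    // the default quoteIdentifier() and statement builders may be fine for
    // plain INSERTs; override them if Teradata needs different SQL
}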

Best,
Jark


On Thu, 5 Mar 2020 at 23:46, Norm Vilmer (Contractor) <norm.vil...@wnco.com>
wrote:

> Thanks for the reply, Arvid. I changed the property names in my
> ConnectorDescriptor subclass to match what the validator wanted and now get:
>
>
>
> “Could not find a suitable table factory for
> 'org.apache.flink.table.factories.TableSinkFactory' in the classpath.
>
> Reason: No factory supports all properties.
>
> The matching candidates:
> org.apache.flink.api.java.io.jdbc.JDBCTableSourceSinkFactory
> Unsupported property keys:
> update-mode”
>
>
>
> The method suggested in the link you sent, registerTableSink, is
> deprecated in 1.10, so I was trying to use the following:
>
>
>
>         final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
>         final StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);
>
>                                 . . .
>
>         tableEnv.connect(new Teradata())
>                 .withSchema(new Schema()
>                         .field("f0", DataTypes.VARCHAR(25))
>                         .field("f1", DataTypes.VARCHAR(10240))
>                         .field("f2", DataTypes.VARCHAR(10240)))
>                 .inAppendMode()
>                 .createTemporaryTable("staging");
>
>         Table table = tableEnv.fromDataStream(reportData);
>         table.insertInto("staging");
>
>
>
> Using the connect() method, I can see that the code attempts to use the
> JDBCTableSourceSinkFactory, but does not like ‘update-mode’.
>
>
>
> Do you have an example using connect() method? Thanks.
>
>
>
> *From:* Arvid Heise <ar...@ververica.com>
> *Sent:* Thursday, March 5, 2020 1:15 AM
> *To:* Norm Vilmer (Contractor) <norm.vil...@wnco.com>
> *Cc:* user@flink.apache.org
> *Subject:* EXTERNAL - Re: Teradata as JDBC Connection
>
>
>
>
> Hi Norm,
>
>
>
> the error message already points to the main issue: your property names
> are not correct.
>
>
>
>
>
>
> *Unsupported property keys: drivername update-mode password dburl username*
>
>
>
> You should use the builder to properly configure the sink [1].
>
>
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/connect.html#jdbcappendtablesink
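>
> For example, a rough sketch adapted to your Teradata driver and URL
> (untested; it mirrors the linked documentation):
>
> JDBCAppendTableSink sink = JDBCAppendTableSink.builder()
>     .setDrivername("com.teradata.jdbc.TeraDriver")
>     .setDBUrl("jdbc:teradata://myserver/database=mydb,LOGMECH=LDAP")
>     .setUsername("...")
>     .setPassword("...")
>     .setQuery("INSERT INTO staging (f0, f1, f2) VALUES (?, ?, ?)")
>     .setParameterTypes(Types.STRING, Types.STRING, Types.STRING)
>     .build();
>
> tableEnv.registerTableSink(
>     "staging",
>     new String[]{"f0", "f1", "f2"},
>     new TypeInformation[]{Types.STRING, Types.STRING, Types.STRING},
>     sink);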
>
>
>
> On Thu, Mar 5, 2020 at 12:58 AM Norm Vilmer (Contractor) <
> norm.vil...@wnco.com> wrote:
>
> Same error with this change:
>
> public class Teradata extends ConnectorDescriptor {
>     /**
>      * Constructs a {@link ConnectorDescriptor}.
>      */
>     public Teradata() {
>         super("jdbc", 1, false);
>     }
>
>     @Override
>     protected Map<String, String> toConnectorProperties() {
>         Map<String, String> map = new HashMap<>();
>         map.put(JDBCValidator.CONNECTOR_DRIVER, "com.teradata.jdbc.TeraDriver");
>         map.put(JDBCValidator.CONNECTOR_URL, "jdbc:teradata://myserver/database=mydb,LOGMECH=LDAP");
>         map.put(JDBCValidator.CONNECTOR_USERNAME, "...");
>         map.put(JDBCValidator.CONNECTOR_PASSWORD, "...!");
>         return map;
>     }
> }
>
> -----Original Message-----
> From: Norm Vilmer (Contractor) <norm.vil...@wnco.com>
> Sent: Wednesday, March 4, 2020 10:37 AM
> To: user@flink.apache.org
> Subject: EXTERNAL - Teradata as JDBC Connection
>
>
> Using Flink 1.10 and coding in Java 11, is it possible to write to
> Teradata in append mode? MySQL, PostgreSQL, and Derby are the only
> supported drivers listed. Thanks.
>
>
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/connect.html#connectors
>
> I created the ConnectorDescriptor below and am using it from
> tableEnvironment.connect() but get the exception shown below.
>
> public class Teradata extends ConnectorDescriptor {
>     /**
>      * Constructs a {@link ConnectorDescriptor}.
>      */
>     public Teradata() {
>         super("jdbc", 1, false);
>     }
>
>     @Override
>     protected Map<String, String> toConnectorProperties() {
>         Map<String, String> map = new HashMap<>();
>         map.put("Drivername", "com.teradata.jdbc.TeraDriver");
>         map.put("DBUrl",
> "jdbc:teradata://myserver/database=mydb,LOGMECH=LDAP");
>         map.put("Username", "...");
>         map.put("Password", "...");
>         return map;
>     }
> }
>
> org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find
> a suitable table factory for
> 'org.apache.flink.table.factories.TableSinkFactory' in the classpath.
>
> Reason: No factory supports all properties.
>
> The matching candidates:
> org.apache.flink.api.java.io.jdbc.JDBCTableSourceSinkFactory
> Unsupported property keys:
> drivername
> update-mode
> password
> dburl
> username
>
> The following properties are requested:
> connector.property-version=1
> connector.type=jdbc
> dburl=jdbc:teradata://myserver/database=mydb,LOGMECH=LDAP
> drivername=com.teradata.jdbc.TeraDriver
> password=xxx
> schema.0.data-type=VARCHAR(1)
> schema.0.name=f0
> schema.1.data-type=VARCHAR(1)
> schema.1.name=f1
> schema.2.data-type=VARCHAR(1)
> schema.2.name=f2
> update-mode=append
> username=xxx
>
> The following factories have been considered:
> org.apache.flink.table.sinks.CsvBatchTableSinkFactory
> org.apache.flink.table.sinks.CsvAppendTableSinkFactory
> org.apache.flink.api.java.io.jdbc.JDBCTableSourceSinkFactory
> org.apache.flink.streaming.connectors.kafka.Kafka011TableSourceSinkFactory
> org.apache.flink.streaming.connectors.kafka.Kafka010TableSourceSinkFactory
> org.apache.flink.streaming.connectors.kafka.Kafka09TableSourceSinkFactory
>