I would try double-checking whether the JDBC connector was truly bundled
in your jar, specifically whether
org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory is.
I can't think of a reason why this shouldn't work for the JDBC connector.
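One quick way to verify is to run the same ServiceLoader lookup that Flink's
factory discovery relies on, with your job jar on the classpath (a minimal
sketch; FactoryCheck is just a throwaway name):

import java.util.ServiceLoader;
import org.apache.flink.table.factories.Factory;

// Throwaway check: prints every table factory discoverable through
// META-INF/services/org.apache.flink.table.factories.Factory entries,
// which is the lookup behind "Available factory identifiers are: ..."
public class FactoryCheck {
    public static void main(String[] args) {
        for (Factory factory : ServiceLoader.load(Factory.class)) {
            System.out.println(factory.factoryIdentifier() + " -> " + factory.getClass().getName());
        }
    }
}

If 'jdbc' is missing from that output even though the class itself is inside
the jar, the META-INF/services entry was most likely lost during packaging.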
On 12/01/2022 06:34, Ronak Beejawat (rbeejawa) wrote:
Hi Chesnay,
> How do you ensure that the connector is actually available at runtime?
We are providing the below-mentioned dependency in our pom.xml with compile
scope, so it is on the classpath, and it was present in my Flink job's bundled
jar. We do the same for other connectors, e.g. Kafka, and it worked there.
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-jdbc_2.11</artifactId>
  <version>1.14.2</version>
</dependency>
<dependency>
  <groupId>mysql</groupId>
  <artifactId>mysql-connector-java</artifactId>
  <version>5.1.41</version>
</dependency>
> Are you bundling it in a jar or putting it into Flink's lib directory?
Yes, we are building a jar and the connector is bundled in it, but we still
saw this error. So we tried the workaround mentioned in this article:
https://blog.csdn.net/weixin_44056920/article/details/118110949 (putting the
connector jar inside the Flink lib directory), and then it worked. But this is
an extra step we have to do to make it work, and it requires a cluster restart.
But the question is: how did it work for Kafka and not for JDBC? I didn't put
the Kafka jar explicitly in the Flink lib folder.
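One difference we are wondering about (an assumption on our side, not something
we have confirmed): both connectors register their factories in the same file,
META-INF/services/org.apache.flink.table.factories.Factory, and by default the
maven-shade-plugin overwrites that file instead of merging it when several
connector jars provide it, so only one connector's entries survive in the fat
jar. If that is what happens here, adding the ServicesResourceTransformer
should fix it (a sketch, assuming the maven-shade-plugin builds our bundled jar):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Merges META-INF/services files from all connector jars
               instead of letting one overwrite the other -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>

That would also explain why 'kafka' shows up in the error's list of available
factories while 'jdbc' does not.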
Note: I am using the Flink 1.14 release for all my job execution and
implementation, which I believe is a stable version.
Thanks
Ronak Beejawat
From: Chesnay Schepler <ches...@apache.org>
Date: Tuesday, 11 January 2022 at 7:45 PM
To: Ronak Beejawat (rbeejawa) <rbeej...@cisco.com.INVALID>, user@flink.apache.org
Cc: Hang Ruan <ruanhang1...@gmail.com>, Shrinath Shenoy K (sshenoyk) <sshen...@cisco.com>, Karthikeyan Muthusamy (karmuthu) <karmu...@cisco.com>, Krishna Singitam (ksingita) <ksing...@cisco.com>, Arun Yadav (aruny) <ar...@cisco.com>, Jayaprakash Kuravatti (jkuravat) <jkura...@cisco.com>, Avi Sanwal (asanwal) <asan...@cisco.com>
Subject: Re: Could not find any factory for identifier 'jdbc'
How do you ensure that the connector is actually available at runtime?
Are you bundling it in a jar or putting it into Flink's lib directory?
On 11/01/2022 14:14, Ronak Beejawat (rbeejawa) wrote:
Correcting subject -> Could not find any factory for identifier 'jdbc'
From: Ronak Beejawat (rbeejawa)
Sent: Tuesday, January 11, 2022 6:43 PM
To: d...@flink.apache.org; commun...@flink.apache.org; user@flink.apache.org
Cc: Hang Ruan <ruanhang1...@gmail.com>; Shrinath Shenoy K (sshenoyk) <sshen...@cisco.com>; Karthikeyan Muthusamy (karmuthu) <karmu...@cisco.com>; Krishna Singitam (ksingita) <ksing...@cisco.com>; Arun Yadav (aruny) <ar...@cisco.com>; Jayaprakash Kuravatti (jkuravat) <jkura...@cisco.com>; Avi Sanwal (asanwal) <asan...@cisco.com>
Subject: what is efficient way to write Left join in flink
Hi Team,
Getting the below exception while using the JDBC connector:
Caused by: org.apache.flink.table.api.ValidationException: Could not find any
factory for identifier 'jdbc' that implements
'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.
Available factory identifiers are:
blackhole
datagen
filesystem
kafka
print
upsert-kafka
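As far as I understand, these identifiers are collected via Java's
ServiceLoader from META-INF/services/org.apache.flink.table.factories.Factory
files on the classpath, so for 'jdbc' to appear that file would need to
contain this entry (my assumption of what the flink-connector-jdbc jar ships):

org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory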
I have already added the dependency for the JDBC connector in pom.xml, as
mentioned below:
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-jdbc_2.11</artifactId>
  <version>1.14.2</version>
</dependency>
<dependency>
  <groupId>mysql</groupId>
  <artifactId>mysql-connector-java</artifactId>
  <version>5.1.41</version>
</dependency>
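For completeness, the _2.11 suffix has to match the Scala version of the Flink
distribution the job runs on (our assumption is a Scala 2.11 build); on a
Scala 2.12 distribution the dependency would instead be:

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-jdbc_2.12</artifactId>
  <version>1.14.2</version>
</dependency>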
I referred to the release documentation for this:
https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/jdbc/
Please help me with this and suggest a solution!
Thanks
Ronak Beejawat