[ 
https://issues.apache.org/jira/browse/FLINK-21841?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17303802#comment-17303802
 ] 

JieFang.He edited comment on FLINK-21841 at 3/18/21, 9:13 AM:
--------------------------------------------------------------

I use the Maven plugin “maven-assembly-plugin” to package the dependencies and 
the code together into a single jar. The relevant parts of the pom look like this:
{code:java}
<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-sql-connector-kafka_2.11</artifactId>
    <version>1.11.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.11.1</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
...
<plugins>
  <plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <executions>
      <execution>
        <phase>package</phase>
        <goals>
          <goal>single</goal>
        </goals>
      </execution>
    </executions>
    <configuration>
      <descriptorRefs>
        <descriptorRef>jar-with-dependencies</descriptorRef>
      </descriptorRefs>
      <archive>
        <manifest>
          <addClasspath>true</addClasspath>
          <classpathPrefix>lib/</classpathPrefix>
          <mainClass>DimTable.DimTable</mainClass>
        </manifest>
      </archive>
    </configuration>
  </plugin>
</plugins>
{code}
I also find that the jdbc-connector has no such problem; only the kafka-connector 
raises the exception, and there is no problem either when using the kafka-connector 
through the DataStream API.

It seems that FactoryUtil does not find the KafkaDynamicTableFactory.
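
For reference, FactoryUtil discovers table factories through the standard Java ServiceLoader mechanism, so a small diagnostic along these lines (a hypothetical helper class, not part of my job) can show which factories are actually visible from inside the fat jar; if 'kafka' is not listed, the META-INF/services/org.apache.flink.table.factories.Factory entry from flink-sql-connector-kafka was probably lost during packaging or is not visible to the classloader FactoryUtil uses:
{code:java}
import java.util.ServiceLoader;

import org.apache.flink.table.factories.Factory;

/**
 * Hypothetical diagnostic (not part of the job): lists every table Factory that
 * the ServiceLoader mechanism, which FactoryUtil relies on internally, can see
 * on the current classpath.
 */
public class ListTableFactories {
    public static void main(String[] args) {
        for (Factory factory : ServiceLoader.load(Factory.class)) {
            System.out.println(factory.factoryIdentifier()
                    + " -> " + factory.getClass().getName());
        }
    }
}
{code}
Running this from the assembled jar should print an entry with identifier 'kafka' if the Kafka factory is discoverable.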

 


> Can not find kafka-connect with sql-kafka-connector
> ---------------------------------------------------
>
>                 Key: FLINK-21841
>                 URL: https://issues.apache.org/jira/browse/FLINK-21841
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Kafka, Table SQL / Ecosystem
>    Affects Versions: 1.11.1
>            Reporter: JieFang.He
>            Priority: Major
>
>  
> When using sql-kafka with a fat jar (flink-sql-connector-kafka_2.11 bundled 
> in the user jar) on Flink 1.11.1, such as
> {code:java}
> CREATE TABLE user_behavior (
>  user_id INT,
>  action STRING,
>  province INT,
>  ts TIMESTAMP(3)
> ) WITH (
>  'connector' = 'kafka',
>  'topic' = 'intopic',
>  'properties.bootstrap.servers' = 'kafkaserver:9092',
>  'properties.group.id' = 'testGroup',
>  'format' = 'csv',
>  'scan.startup.mode' = 'earliest-offset'
> )
> {code}
>  I get an exception:
> {code:java}
> Caused by: org.apache.flink.table.api.ValidationException: Cannot discover a 
> connector using option ''connector'='kafka''.
>         at 
> org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:329)
>         at 
> org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:118)
>         ... 35 more
> Caused by: org.apache.flink.table.api.ValidationException: Could not find any 
> factory for identifier 'kafka' that implements 
> 'org.apache.flink.table.factories.DynamicTableSourceFactory' in the classpath.
> Available factory identifiers are:
> datagen
> {code}
> It looks like the issue 
> [FLINK-18076|https://issues.apache.org/jira/browse/FLINK-18076] does not deal 
> with all such exceptions.
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
