Hello,

I think I accidentally posted this question to the wrong mailing list (dev), so I
am posting it again here.


I am struggling to run my test Flink project with the Table API.

I am trying to run a simple piece of code:

import java.io.File;

import org.apache.flink.api.common.RuntimeExecutionMode;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.avro.AvroInputFormat;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

// Batch-mode DataStream job that reads an Avro file and wraps it in a Table
final StreamExecutionEnvironment env =
StreamExecutionEnvironment.getExecutionEnvironment();
env.setRuntimeMode(RuntimeExecutionMode.BATCH);

StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

Path in = Path.fromLocalFile(new File("part-v001-o000-r-00330.avro"));
AvroInputFormat<BidSample> bidsInput = new AvroInputFormat<>(in, BidSample.class);
DataStream<BidSample> bidsDS = env.createInput(bidsInput);

Table bidsTable = tableEnv.fromDataStream(bidsDS);
bidsTable.printSchema();


And here are my pom dependencies:

<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-streaming-java_2.12</artifactId>
   <version>1.14.3</version>
</dependency>
<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-clients_2.12</artifactId>
   <version>1.14.3</version>
</dependency>
<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-table-api-java-bridge_2.12</artifactId>
   <version>1.14.3</version>
</dependency>
<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-table-planner_2.12</artifactId>
   <version>1.14.3</version>
</dependency>
<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-table-common</artifactId>
   <version>1.14.3</version>
</dependency>
<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-table-api-java</artifactId>
   <version>1.14.3</version>
</dependency>



But I keep getting the error below:
Caused by: java.lang.ClassNotFoundException: org.apache.flink.connector.file.table.factories.BulkReaderFormatFactory
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:766)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)



Any help understanding why this is happening, and how to fix it, would be much appreciated.

Many thanks,
Adesh DSilva
