I'm trying to run a Flink job as a standalone program and am getting the
following error:

Caused by: org.apache.flink.table.api.ValidationException: Could not find
any factory for identifier 'filesystem' that implements
'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.

Available factory identifiers are:

blackhole
datagen
kafka
print
upsert-kafka
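
For context, the lookup fails when the job hits a table declared with
`'connector' = 'filesystem'`. A minimal sketch of the kind of DDL involved
(the table name, schema, and path here are placeholders, not the real ones):

```sql
-- Hypothetical sink definition; the actual table differs.
CREATE TABLE sink_table (
  id BIGINT,
  payload STRING
) WITH (
  'connector' = 'filesystem',
  'path' = 'file:///tmp/output',  -- placeholder path
  'format' = 'parquet'
);
```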

Here's my list of dependencies (the Flink version is 1.16.0):

libraryDependencies ++= Seq(
  // Flink
  "org.apache.flink" %% "flink-streaming-scala" % versions.flink,
  "org.apache.flink" %% "flink-table-api-scala" % versions.flink,
  "org.apache.flink" %% "flink-table-api-scala-bridge" % versions.flink,
  "org.apache.flink" % "flink-connector-kafka" % versions.flink,
  "org.apache.flink" % "flink-avro-confluent-registry" % versions.flink,
  "org.apache.flink" %% "flink-table-planner" % versions.flink,
  "org.apache.flink" % "flink-avro" % versions.flink,
  "org.apache.flink" % "flink-clients" % versions.flink,
  "org.apache.flink" % "flink-runtime" % versions.flink,
  "org.apache.flink" % "flink-runtime-web" % versions.flink,
  "org.apache.flink" % "flink-parquet" % versions.flink,
  // Hadoop
  "org.apache.hadoop" % "hadoop-client" % versions.hadoop,
  // Misc
  "org.rogach" %% "scallop" % "4.1.0",
  "ch.qos.logback" % "logback-classic" % "1.4.4"
)

I've also tried running the job on a host with Hadoop installed, setting
HADOOP_CLASSPATH and HADOOP_CONF_DIR, but the result is the same.
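
For reference, this is roughly how the environment was set up on that host
(the install location is an assumption; adjust to the actual Hadoop home):

```shell
# Assumed install location for this sketch.
export HADOOP_HOME=/opt/hadoop
export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
# Put the full Hadoop classpath on Flink's classpath.
export HADOOP_CLASSPATH="$(hadoop classpath)"
```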
