Re: sql program throw exception when new kafka with csv format

2018-12-11 Thread Timo Walther
Hi Marvin, the CSV format is not supported for Kafka so far. Only formats that have the tag `DeserializationSchema` in the docs are supported. Right now you have to implement your own DeserializationSchemaFactory or use JSON or Avro. You can follow [1] to be informed once the CSV format is supported.
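
For reference, a minimal sketch of the JSON workaround using the descriptor API of Flink 1.7; the topic name, broker address, group id, and field names below are made up for illustration and have to be adapted to your setup:

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.descriptors.Json;
import org.apache.flink.table.descriptors.Kafka;
import org.apache.flink.table.descriptors.Schema;
import org.apache.flink.types.Row;

public class KafkaJsonSourceExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

        tableEnv
            .connect(
                new Kafka()
                    .version("0.11")                                  // match your Kafka connector dependency
                    .topic("my-topic")                                // example topic
                    .property("bootstrap.servers", "localhost:9092")
                    .property("group.id", "my-group")
                    .startFromEarliest())
            .withFormat(
                new Json()                                            // JSON instead of CSV
                    .failOnMissingField(false)
                    .deriveSchema())                                  // derive the JSON format from the table schema
            .withSchema(
                new Schema()
                    .field("user_id", Types.LONG)                     // example fields
                    .field("item_id", Types.STRING))
            .inAppendMode()
            .registerTableSource("kafka_source");

        Table result = tableEnv.sqlQuery("SELECT user_id, item_id FROM kafka_source");
        tableEnv.toAppendStream(result, Row.class).print();

        env.execute("kafka-json-source-example");
    }
}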

Re: sql program throw exception when new kafka with csv format

2018-12-11 Thread Hequn Cheng
Hi Marvin, I took a look at the Flink code. It seems we can't use the CSV format for Kafka. You can use JSON instead. As the exception shows, Flink can't find a suitable DeserializationSchemaFactory. Currently, only JSON and Avro support DeserializationSchemaFactory. Best, Hequn
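
Note also that a format factory is only found if the corresponding format module is on the classpath, e.g. flink-json for the JSON format; the version below is just an example and should match your Flink version:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-json</artifactId>
    <version>1.7.0</version>
</dependency>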

sql program throw exception when new kafka with csv format

2018-12-11 Thread Marvin777
Registering a Kafka message source with the CSV format, the error message is as follows: Exception in thread "main" org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.DeserializationSchemaFactory' in the classpath.
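
For context, the registration that triggers this presumably looks roughly like the sketch below (reusing the tableEnv from the JSON sketch above; topic, fields, and delimiter are made up). At the time of writing there is no DeserializationSchemaFactory for CSV, so combining the Csv descriptor with the Kafka connector fails with the exception above:

// Fails: no factory implements DeserializationSchemaFactory for the CSV format,
// so the planner cannot create a Kafka source with this format descriptor.
tableEnv
    .connect(
        new Kafka()
            .version("0.11")
            .topic("my-topic")
            .property("bootstrap.servers", "localhost:9092"))
    .withFormat(
        new Csv()                                     // org.apache.flink.table.descriptors.Csv
            .field("user_id", Types.LONG)
            .field("item_id", Types.STRING)
            .fieldDelimiter(","))
    .withSchema(
        new Schema()
            .field("user_id", Types.LONG)
            .field("item_id", Types.STRING))
    .inAppendMode()
    .registerTableSource("kafka_csv_source");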