Just realized, my question was probably not clear enough. :-)

I understand that the Avro (or JSON, for that matter) format can be ingested
as described here:
https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/connect.html#apache-avro-format,
but this still requires the entire table schema to be spelled out in the
"CREATE TABLE" statement. Is it possible to just specify the Avro schema and
let Flink map it to an SQL table?
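
To make it concrete, here's a rough sketch of what I mean. The table/topic
names are made up, tableEnv is assumed to be a StreamTableEnvironment, and
the AvroSchemaConverter call at the end is just my guess from browsing the
flink-avro source; I may be misreading what's actually public API there:

    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
    import org.apache.flink.table.types.DataType;

    // (Inside some method that can throw IOException.)

    // What I have to do today: repeat every Avro field by hand in the DDL,
    // duplicating what's already in the .avsc file.
    tableEnv.executeSql(
        "CREATE TABLE user_events (\n"
        + "  user_id STRING,\n"
        + "  event_time TIMESTAMP(3),\n"
        + "  payload STRING\n"
        + ") WITH (\n"
        + "  'connector' = 'kafka',\n"
        + "  'topic' = 'user-events',\n"
        + "  'format' = 'avro'\n"
        + ")");

    // What I'm hoping is possible: read the .avsc once and let Flink
    // derive the row type from it, so the Avro schema stays the single
    // source of truth.
    String avroSchemaJson = new String(
        Files.readAllBytes(Paths.get("user_events.avsc")),
        StandardCharsets.UTF_8);
    DataType rowType = AvroSchemaConverter.convertToDataType(avroSchemaJson);

Even if something like convertToDataType gives me the DataType, I don't see
how to feed that into the DDL/connector path, hence the question.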

BTW, the above link is titled "Table API Legacy Connectors", so is this
approach still supported? Same question for the YAML-based specification.

Thanks,
Sumeet

On Fri, Apr 2, 2021 at 8:26 AM Sumeet Malhotra <sumeet.malho...@gmail.com>
wrote:

> Hi,
>
> Is it possible to directly import Avro schema while ingesting data into
> Flink? Or do we always have to specify the entire schema in either SQL DDL
> for Table API or using DataStream data types? From a code maintenance
> standpoint, it would be really helpful to keep one source of truth for the
> schema somewhere.
>
> Thanks,
> Sumeet
>
