Hi Tao,

This is currently not possible using the Table API, though this will likely
change in a future version. For now, you would have to do that using the
DataStream API [1] and then switch to the Table API.
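For illustration, here is a minimal sketch of that workaround: building an
Avro schema with your own record name and writing it to Kafka from the
DataStream API via the Confluent registry serializer. The topic name,
subject, registry URL, broker address, and the schema itself are all
assumptions for the example, not values from your setup, and the snippet
needs a running Kafka/Schema Registry to actually execute:

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroSerializationSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

public class CustomRecordNameSink {
    public static void main(String[] args) {
        // Here you pick the record name ("MyEvent") and namespace yourself,
        // instead of the name hard-coded by the Table API's avro-confluent format.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"MyEvent\",\"namespace\":\"com.example\","
          + "\"fields\":[{\"name\":\"id\",\"type\":\"long\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumed broker

        // forGeneric(subject, schema, registryUrl) serializes GenericRecords and
        // registers the schema under the given subject in the Schema Registry.
        FlinkKafkaProducer<GenericRecord> sink = new FlinkKafkaProducer<>(
            "my-topic", // assumed topic
            ConfluentRegistryAvroSerializationSchema.forGeneric(
                "my-topic-value", schema, "http://localhost:8081"),
            props);

        // A DataStream<GenericRecord> built upstream would then be attached with
        // events.addSink(sink); afterwards the same stream can be handed to the
        // Table API via StreamTableEnvironment.fromDataStream(...).
    }
}
```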

Best wishes,
Nico

[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/connectors/datastream/kafka/

On Sun, Jun 6, 2021 at 9:55 AM tao xiao <xiaotao...@gmail.com> wrote:

> Hi team,
>
> I want to use avro-confluent to encode the data using SQL, but the schema
> registered by the encoder hard-codes the schema name to 'record'. Is it
> possible to dictate the name?
>
> --
> Regards,
> Tao
>
