Hi Yuval,

TypeConversions.fromDataTypeToLegacyInfo was only a utility for bridging between TypeInformation and DataType until TypeInformation is no longer exposed through the Table API.

Beginning with Flink 1.13, the Table API is able to serialize records to the first DataStream operator via toDataStream or toChangelogStream. Internally, it uses org.apache.flink.table.runtime.typeutils.ExternalTypeInfo for that. The binary representation uses internal data structures, and conversion is performed during serialization/deserialization:

conversion -> internal -> conversion

You have two possibilities:

1) You simply call `tableEnv.toDataStream(table).getType()` and pass this type on to the next operator.
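For the first option, a minimal sketch (assuming an existing StreamTableEnvironment `tableEnv` and Table `table` — both names are placeholders):

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.types.Row;

// Convert the table; toDataStream derives an ExternalTypeInfo internally.
DataStream<Row> stream = tableEnv.toDataStream(table);

// Reuse the derived type for downstream stateful operators.
TypeInformation<Row> rowTypeInfo = stream.getType();

// e.g. pass it explicitly where the DataStream API needs a type,
// such as a state descriptor or an operator's returns(...) call:
stream.map(row -> row).returns(rowTypeInfo);
```
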

2) You define your own TypeInformation as you usually would in the DataStream API without the Table API.
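For the second option, a hand-written TypeInformation can be built with the DataStream API's Types factory; the field names and types below are illustrative assumptions, not derived from any particular table schema:

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.types.Row;

// Manually declare the row type that matches your table's schema.
// "id" and "name" are example fields; replace them with your own.
TypeInformation<Row> rowTypeInfo = Types.ROW_NAMED(
        new String[] {"id", "name"},
        Types.LONG,
        Types.STRING);
```

This keeps the DataStream side independent of the Table API's type derivation, at the cost of having to keep the declaration in sync with the table schema by hand.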

We might serialize `Row`s with `RowSerializer` again in the near future, but for now we went with the most generic solution that supports everything that can come out of the Table API.

Regards,
Timo

On 04.06.21 15:12, Yuval Itzchakov wrote:
When upgrading to Flink 1.13, I ran into deprecation warnings on TypeConversions:


The deprecation message states that this API will be removed soon, but it does not mention the alternatives that can be used for these transformations.

My use case is that I have a table that needs to be converted into a DataStream[Row], and in turn I need to apply some stateful transformations on it. In order to do that, I need the TypeInformation[Row] it produces so I can pass it into the various state functions.

@Timo Walther <mailto:twal...@apache.org> I would love your help on this.
--
Best Regards,
Yuval Itzchakov.
