Hi Dominik,

I am pulling in Timo who might know more about this.

Cheers,
Till

On Mon, Aug 9, 2021 at 3:21 PM Dominik Wosiński <wos...@gmail.com> wrote:

> Hey all,
>
> I think I've hit some weird issue in Flink TypeInformation generation. I
> have the following code:
>
> val stream: DataStream[Event] = ...
> tableEnv.createTemporaryView("TableName", stream)
> val table = tableEnv
>   .sqlQuery("SELECT id, timestamp, eventType FROM TableName")
> tableEnv.toAppendStream[NewEvent](table)
>
> In this particular example *Event* is an Avro-generated class and *NewEvent*
> is just a POJO. This is just a toy example, so please ignore the fact that
> this operation doesn't make much sense.
>
> When I try to run the code I am getting the following error:
>
> org.apache.flink.table.api.ValidationException: Column types of query
> result and sink for unregistered table do not match.
> Cause: Incompatible types for sink column 'licence' at position 0.
> Query schema: [id: RAW('org.apache.avro.util.Utf8', '...'), timestamp:
> BIGINT NOT NULL, kind: RAW('org.test.EventType', '...')]
> Sink schema: [id: RAW('org.apache.avro.util.Utf8', '?'), timestamp:
> BIGINT, kind: RAW('org.test.EventType', '?')]
>
> So it seems that the types are recognized correctly, but according to Flink
> there is still a mismatch, maybe because a different type serializer is
> used?
>
> Thanks in advance for any help,
> Best Regards,
> Dom.
>
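One way to sidestep the RAW type mismatch above is to convert the Avro-specific
fields (the Utf8 strings and the generated EventType enum) into plain JVM types
before registering the view, so that the table schema contains STRING and BIGINT
instead of RAW types. A rough sketch, assuming the generated Event class exposes
getId/getTimestamp/getEventType accessors and NewEvent is a POJO with
String/Long/String fields (both are assumptions, adjust to the actual classes):

import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.bridge.scala._

// Plain-typed intermediate record; keeps Avro's Utf8 and the generated enum
// out of the table schema (field names assumed to mirror the query above).
case class EventRow(id: String, timestamp: Long, eventType: String)

// Accessor names on Event are assumptions about the Avro-generated class.
val rowStream: DataStream[EventRow] = stream
  .map(e => EventRow(e.getId.toString, e.getTimestamp, e.getEventType.name()))

tableEnv.createTemporaryView("TableName", rowStream)

// `timestamp` is backtick-escaped because TIMESTAMP is a reserved SQL keyword.
val table = tableEnv.sqlQuery("SELECT id, `timestamp`, eventType FROM TableName")

// With only STRING/BIGINT columns in the query schema, the conversion to the
// NewEvent POJO no longer has to reconcile two different RAW serializers.
tableEnv.toAppendStream[NewEvent](table)

The extra map keeps the workaround local to this job and does not require
changing how the Avro classes are generated.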
