Hi Jark & Timo. I'd be glad to drive this feature. If you both agree, I'll
prepare a FLIP, and then you and other developers can review and check the
specifics.
Thanks.
Jark Wu wrote on Tue, Aug 30, 2022 at 20:24:
Thank you Ran for the explanation.
The column DEFAULT is a reasonable feature and can also help in other cases.
I’m fine with adding this feature.
Do you want to prepare a FLIP for it?
Best,
Jark
> On Aug 29, 2022, at 15:02, Ran Tao wrote:
Hi Jark. Timo summed it up very well. In fact, my problem is that the
current Flink table metadata is fixed and cannot accommodate changes in the
connector's metadata columns.
A metadata column that did not exist in the past may exist at some point
in the future, and vice versa.
There is f
Your understanding is correct. In fact, my question is very simple: the
metadata of a Flink table is currently fixed and cannot stay compatible
with changes in the connector.
What you said about forward compatibility and backward compatibility is
very accurate; the 'DEFAULT' constraint is
Hi Ran,
so if I understand it correctly, the problem here is not only backward
compatibility but also forward compatibility. You might run different
versions of your connector; some of them offer a metadata key A and some
don't offer it yet. But the DDL should work for both connector
implement
Hi Ran,
If the metadata comes from the message properties, then you can manually cast it
to your preferred type,
such as `my_dynamic_meta AS CAST(properties['my-new-property'] AS TIMESTAMP)`.
If the metadata is not from the message properties, how does the connector know
which field to convert f
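Jark's inline snippet above can be expanded into a full DDL sketch. This is hypothetical: the table name, the `kafka` connector, and the `properties` metadata key are assumptions for illustration, not an existing connector contract.

```sql
-- Hypothetical sketch: expose all message properties as one MAP metadata
-- column and derive a typed column from it via a computed column.
-- The 'properties' metadata key and the 'kafka' connector are assumptions.
CREATE TABLE my_source (
    props MAP<STRING, STRING> METADATA FROM 'properties',
    my_dynamic_meta AS CAST(props['my-new-property'] AS TIMESTAMP(3)),
    f0 STRING
) WITH (
    'connector' = 'kafka'
);
```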
Hi, Timo. I think using one map column in the Debezium format you
illustrated above can't cover the scenario under discussion.
It's not the same thing.
Here is a Debezium format example from the Flink docs: [1]
```
CREATE TABLE KafkaTable (
origin_ts TIMESTAMP(3) METADATA FROM 'value.ingestion-timestamp'
Hi Ran,
what would be the data type of this dynamic metadata column? The planner
and many parts of the stack will require a data type.
Personally, I feel connector developers can already achieve the same
functionality by declaring a metadata column as `MAP`.
This is what we expose already as `d
```
create table test_source (
    __test_metadata__ varchar METADATA,
    f0 varchar,
    f1 varchar,
    f2 bigint,
    ts as CURRENT_TIMESTAMP
) with (
    'connector' = 'test',
    ...
)
```
If we do not pre-define `__test_metadata__` as a metadata key by implementing
`listReadableMetadata` and then run the above SQL, it will cause an exception.
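The failure mode can be illustrated with a toy model (plain Java, not Flink code) of the planner-side check: every METADATA column in the DDL must match a key returned by the connector's `listReadableMetadata()`, and unknown keys are rejected at validation time. The key names and types below are assumptions for illustration.

```java
import java.util.List;
import java.util.Map;

// Toy model (not Flink code) of why the DDL above fails: during planning,
// each METADATA column is checked against the keys the connector declares
// via listReadableMetadata(); undeclared keys are rejected.
public class MetadataValidation {

    // Stand-in for the connector's declared metadata keys and their types
    // (key names and types are assumptions for this sketch).
    static Map<String, String> listReadableMetadata() {
        return Map.of("topic", "STRING", "timestamp", "TIMESTAMP(3)");
    }

    static void validateMetadataColumns(List<String> declaredKeys) {
        Map<String, String> readable = listReadableMetadata();
        for (String key : declaredKeys) {
            if (!readable.containsKey(key)) {
                throw new IllegalArgumentException(
                        "Invalid metadata key '" + key + "'");
            }
        }
    }

    public static void main(String[] args) {
        // A declared key passes validation silently.
        validateMetadataColumns(List.of("topic"));
        // An undeclared key, like __test_metadata__ in the DDL above, fails.
        try {
            validateMetadataColumns(List.of("__test_metadata__"));
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: __test_metadata__");
        }
    }
}
```

This is why a "dynamic" metadata column cannot be declared today: the set of valid keys is fixed at the connector-version level, which is exactly the forward/backward-compatibility gap discussed above.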