Hi Jane,
Thanks for your response. Yes, it did throw a parsing error (Apache
Calcite; I guess Flink uses it internally).
Since I am creating this Flink table by consuming a Kafka topic, I don't
have the ability to change the Avro schema. Maybe I can check the
possibility of introducing a new field…
Hi Elakiya,
Did you encounter a ParserException when executing the DDL? AFAIK, Flink
SQL does not support declaring a nested column (a compound identifier) as a
primary key at the syntax level.
A possible workaround is to change the schema so that it does not contain a
record type; then you can change the DDL to the following…
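The workaround above might be sketched as a DDL like the following. This is a hypothetical sketch, assuming the `employee` record from the thread is flattened into top-level columns; the connector options and placeholder values are illustrative, not a verified config:

```sql
-- Flattened schema: `id` is now a top-level column, so it can be
-- declared as the primary key without a compound identifier.
CREATE TABLE KafkaTable (
  id STRING,
  name STRING,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'employee',
  'properties.bootstrap.servers' = '...',
  'key.format' = 'avro-confluent',
  'key.avro-confluent.url' = '...',
  'value.format' = 'avro-confluent',
  'value.avro-confluent.url' = '...'
);
```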
Hi Hang,
Once again, thanks for your response, but I think you have misunderstood
my question. At present we are only using the DDL form of the Table API,
and the only issue we face is that we are unable to set the primary key
field for the Flink table, since the value we want to use as the primary
key is pre…
Hi, Elakiya.
If everything is right for the KafkaTable, I think there must be a
`user_id` field in the Kafka message key.
We can see the code in the method `createKeyValueProjections` of
`UpsertKafkaDynamicTableFactory`, as follows.
```
// Signature truncated in the original message; the generic parameters
// <int[], int[]> were likely stripped by the archive's HTML rendering.
private Tuple2<int[], int[]> createKeyValueProjections(
        ResolvedCatalogTable catalogTable) {
    // ...
```
Hi Hang,
The select query works absolutely fine; we have also implemented join
queries, which also work without any issues.
Thanks,
Elakiya
On Mon, Jul 10, 2023 at 2:03 PM Hang Ruan wrote:
Hi, Elakiya.
Maybe this DDL could be executed. Please execute the SELECT query `select *
from KafkaTable`. Then I think there will be some error, or the `user_id`
will not be read correctly.
Best,
Hang
elakiya udhayanan wrote on Mon, Jul 10, 2023 at 16:25:
Hi Hang Ruan,
Thanks for your response. But the documentation has an example of defining
the primary key in a DDL statement (code below). In that case, we should be
able to define the primary key in the DDL, right? We have defined the
primary key in our earlier use cases when it wasn't…
Hi, elakiya.
The upsert-kafka connector will read the primary keys from the Kafka
message keys. We cannot define the fields in the Kafka message values as
the primary key.
Best,
Hang
elakiya udhayanan wrote on Mon, Jul 10, 2023 at 13:49:
Hi team,
I have a Kafka topic named employee, which uses a Confluent Avro schema
and will emit a payload such as:
{
  "employee": {
    "id": "123456",
    "name": "sampleName"
  }
}
I am using the upsert-kafka connector to consume the events from the above
Kafka topic, using the Flink SQL DDL statement below…
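The DDL above is truncated; a hypothetical sketch of what such a nested-record DDL might look like follows (column names taken from the payload above; connector options are illustrative placeholders). Note that the nested primary key declaration shown here is exactly the kind of compound identifier the rest of the thread says Flink SQL rejects:

```sql
CREATE TABLE KafkaTable (
  employee ROW<id STRING, name STRING>,
  -- Flink SQL does not accept a compound identifier as a primary key,
  -- so this declaration fails to parse:
  PRIMARY KEY (employee.id) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'employee',
  'properties.bootstrap.servers' = '...',
  'key.format' = 'avro-confluent',
  'value.format' = 'avro-confluent'
);
```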
Hi elakiya,
I think you may need to spread the columns across the key and value
formats; then you can use the specific column as a primary key in the DDL.
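The suggestion above might be sketched like this, assuming the Kafka message key carries the `id` field and the value carries the rest. The upsert-kafka connector's `value.fields-include` option controls whether key columns are also expected in the value; the exact config is illustrative:

```sql
CREATE TABLE KafkaTable (
  id STRING,            -- read from the Kafka message key
  name STRING,          -- read from the Kafka message value
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'employee',
  'properties.bootstrap.servers' = '...',
  'key.format' = 'avro-confluent',
  'value.format' = 'avro-confluent',
  -- key columns are not repeated in the message value:
  'value.fields-include' = 'EXCEPT_KEY'
);
```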
Best,
Shammon FY
On Fri, Jun 23, 2023 at 6:36 PM elakiya udhayanan wrote: