Hi Xuyang,
Thanks again for the insights on how to use the DataStream API for my use
case; I will explore it and experiment with it.
I wanted to use the value inside the row datatype as a primary key because I
might get multiple records for the same id, and when I try to make a join
with …
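A minimal, self-contained sketch of that behaviour, modelled on the fromChangelogStream example in the Flink documentation (the id and names below are made up): once the extracted id is declared as a primary key and the stream is interpreted with upsert semantics, a second record for the same id updates the first instead of adding a duplicate row, so a later join no longer fans out.

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.connector.ChangelogMode;
import org.apache.flink.types.Row;
import org.apache.flink.types.RowKind;

public class UpsertByIdDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Two records for the same id, as in the scenario described above.
        DataStream<Row> records = env
            .fromElements(
                Row.ofKind(RowKind.INSERT, "123456", "sampleName"),
                Row.ofKind(RowKind.UPDATE_AFTER, "123456", "renamedEmployee"))
            .returns(Types.ROW(Types.STRING, Types.STRING));

        // Declaring f0 (the id) as primary key and interpreting the stream
        // as an upsert changelog collapses both records into one logical row.
        Table employees = tableEnv.fromChangelogStream(
            records,
            Schema.newBuilder().primaryKey("f0").build(),
            ChangelogMode.upsert());

        // The printed changelog shows +I followed by -U/+U for the same key,
        // i.e. an update rather than a duplicate.
        tableEnv.toChangelogStream(employees).print();
        env.execute();
    }
}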
Hi, Elakiya,
I think you can get what you need here [1], with many examples bridging the
DataStream API and the Table API.
There may be some redundancy, and I'm not sure this is the best way to resolve
the question. First, use the StreamTableEnvironment to execute your original
DDL without a primary key.
Second, use …
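In case a concrete shape of those two steps helps, here is a hedged sketch; every WITH option, hostname and view name below is a placeholder, not taken from the original mail:

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.connector.ChangelogMode;
import org.apache.flink.types.Row;

public class NestedIdAsPrimaryKey {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // First: the original DDL without a primary key, on the plain kafka
        // connector (all WITH options are placeholder values).
        tableEnv.executeSql(
            "CREATE TABLE employee_source ("
                + "  employee ROW<id STRING, name STRING>"
                + ") WITH ("
                + "  'connector' = 'kafka',"
                + "  'topic' = 'employee',"
                + "  'properties.bootstrap.servers' = 'localhost:9092',"
                + "  'format' = 'avro-confluent',"
                + "  'avro-confluent.url' = 'http://localhost:8081',"
                + "  'scan.startup.mode' = 'earliest-offset'"
                + ")");

        // Flatten the nested row so that id becomes a top-level column,
        // which is the only kind of column Flink accepts in a PRIMARY KEY.
        Table flattened = tableEnv.sqlQuery(
            "SELECT employee.id AS id, employee.name AS name FROM employee_source");

        // Second: bridge to a DataStream and back, this time declaring id as
        // the primary key and interpreting the stream with upsert semantics.
        DataStream<Row> rows = tableEnv.toDataStream(flattened);
        Table withPk = tableEnv.fromChangelogStream(
            rows,
            Schema.newBuilder().primaryKey("id").build(),
            ChangelogMode.upsert());
        tableEnv.createTemporaryView("employee_upsert", withPk);
    }
}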
Hi Xuyang,
Thank you for your response. Since I do not have access to create tickets in
the ASF Jira, I have requested access, and once I get it I will raise a
ticket for this issue.
Also, you have asked me to use the DataStream API to extract the id and then
use the Table API feature; since I …
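A hedged sketch of that extraction step, using a hard-coded stand-in stream instead of the real Kafka source (field positions and names are illustrative):

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.types.Row;

public class ExtractNestedId {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Stand-in for the stream consumed from the employee topic: each
        // record is a row with one nested row field (id, name).
        DataStream<Row> employees = env
            .fromElements(Row.of(Row.of("123456", "sampleName")))
            .returns(Types.ROW(Types.ROW(Types.STRING, Types.STRING)));

        // Pull id and name out of the nested row up to the top level, so id
        // can later be declared as a primary key on the Table API side.
        DataStream<Row> flat = employees
            .map(r -> {
                Row employee = (Row) r.getField(0);
                return Row.of(employee.getField(0), employee.getField(1));
            })
            .returns(Types.ROW_NAMED(
                new String[] {"id", "name"}, Types.STRING, Types.STRING));

        flat.print();
        env.execute();
    }
}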
Hi team,
I have a Kafka topic named employee that uses a Confluent Avro schema and
emits payloads like the one below:
{
  "employee": {
    "id": "123456",
    "name": "sampleName"
  }
}
I am using the upsert-kafka connector to consume the events from the above
Kafka topic, using a Flink SQL DDL statement …
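The DDL itself is cut off above. For context, a hedged sketch of the shape such a statement takes follows; the catch this thread revolves around is the PRIMARY KEY clause, since upsert-kafka requires one and Flink only accepts top-level physical columns there, so the nested employee.id cannot be referenced directly. The variant below only validates because id is declared as a separate top-level column populated from the Kafka message key; all WITH options are placeholders, not the poster's actual settings:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class EmployeeUpsertDdlSketch {
    public static void main(String[] args) {
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(
            StreamExecutionEnvironment.getExecutionEnvironment());

        // upsert-kafka insists on a PRIMARY KEY, and a PRIMARY KEY must name
        // top-level physical columns, so 'PRIMARY KEY (employee.id)' on the
        // nested row is rejected; this sketch sources id from the record key
        // instead (all connection values below are placeholders).
        tableEnv.executeSql(
            "CREATE TABLE employee ("
                + "  id STRING NOT NULL,"
                + "  employee ROW<id STRING, name STRING>,"
                + "  PRIMARY KEY (id) NOT ENFORCED"
                + ") WITH ("
                + "  'connector' = 'upsert-kafka',"
                + "  'topic' = 'employee',"
                + "  'properties.bootstrap.servers' = 'localhost:9092',"
                + "  'key.format' = 'raw',"
                + "  'value.format' = 'avro-confluent',"
                + "  'value.avro-confluent.url' = 'http://localhost:8081'"
                + ")");
    }
}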