Hi team,
I have a Kafka topic named employee that uses a Confluent Avro schema and
emits payloads like the one below:
{
  "employee": {
    "id": "123456",
    "name": "sampleName"
  }
}
I am using the upsert-kafka connector to consume the events from this
Kafka topic with the following Flink SQL DDL statement
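The DDL itself is cut off above, but a minimal upsert-kafka sketch for a topic with this nested payload might look like the following. The table name, bootstrap servers, key format, primary-key column, and Schema Registry URL are all assumptions, not values from the original message:

```sql
-- Sketch only: upsert-kafka requires a PRIMARY KEY, which is read from
-- the Kafka message key; the nested value maps to a ROW type.
CREATE TABLE employee_src (
  emp_id STRING,                          -- assumed Kafka key column
  employee ROW<id STRING, name STRING>,   -- nested Avro record in the value
  PRIMARY KEY (emp_id) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'employee',
  'properties.bootstrap.servers' = 'localhost:9092',  -- assumption
  'key.format' = 'raw',                               -- assumption
  'value.format' = 'avro-confluent',
  'value.avro-confluent.url' = 'http://localhost:8081' -- assumption
);
```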
Hi,
I just wanted to confirm whether Flink really has role-based access
control. We have linked it to our LDAP, but the requirement is that only
administrators should be able to upload a JAR file.
I've been reading the documentation but couldn't find anything; maybe I missed it.
Regards,
Patri
Hello Everyone,
I am trying to implement a custom source where split detection is an
expensive operation, and I would like to use the split reader's results
to build the next splits.
Basically, my source takes as input an id from my external system;
let's call it ID1.
From ID1, I can g
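The message is cut off here, but feeding reader results back into split planning is typically done in Flink's FLIP-27 Source API by sending a custom SourceEvent from the reader to the enumerator (via SourceReaderContext#sendSourceEventToCoordinator, received in SplitEnumerator#handleSourceEvent). A hypothetical event class carrying discovered ids might look like this; the class and field names are illustrative, not from the original message:

```java
import java.io.Serializable;
import java.util.List;

// Hypothetical payload a split reader could send back to the enumerator
// after finishing a split. In a real source this class would also
// implement org.apache.flink.api.connector.source.SourceEvent, and the
// enumerator would use the discovered ids to plan its next splits.
public class DiscoveredIdsEvent implements Serializable {
    public final String finishedSplitId;   // split the reader just processed
    public final List<String> discoveredIds; // ids found while reading it

    public DiscoveredIdsEvent(String finishedSplitId, List<String> discoveredIds) {
        this.finishedSplitId = finishedSplitId;
        this.discoveredIds = discoveredIds;
    }
}
```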
Hi,
Could you share the detailed exception stack trace? Did you modify any
job logic or parameters?
Currently, Flink only supports simple schema evolution (e.g., adding or
removing fields of POJO types) for DataStream jobs [1].
Other modifications may cause this exception, for example:
1. modify some sche
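To illustrate the supported case mentioned above, here is a minimal sketch of a POJO evolution that Flink's POJO serializer can handle on restore (adding a field). The class and field names are hypothetical; Flink's POJO rules require a public no-arg constructor and public fields (or getters/setters):

```java
// Version 1 of a hypothetical POJO held in job state.
public class Employee {
    public String id;
    public String name;

    public Employee() {} // required by Flink's POJO serializer
}

// Version 2: a field was added. On restore from a savepoint written
// with version 1, the new field is initialized to its default (null).
class EmployeeV2 {
    public String id;
    public String name;
    public String department; // newly added field

    public EmployeeV2() {}
}
```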
Hello everyone!
I need some help implementing a CEP pattern.
The pattern should detect when there are no messages from a device for
two consecutive days.
For this purpose, it checks that after receiving a measurement from a
device, no further measurement is received from the same device for 2 days
Thanks for the tips Martijn!
I've pinned the library versions to 1.16 everywhere and also decided to
scrap PyFlink and go for the sql-client instead, to keep things simpler for
now.
This is the Dockerfile I am using for both the *jobmanager* and the
*sql-client*:
FROM flink:1.16.2-scala_2.12-java11