Hi,
I upgraded the project to Flink 1.18.0 and Java 17. I am also using
flink-kafka-connector 3.0.1-1.18 from mvn repository.
However, running it fails with the error:
Unable to make field private final java.lang.Object[]
java.util.Arrays$ArrayList.a accessible: module java.base does not "opens
java.uti
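For context, errors of this shape on Java 17 usually come from strong encapsulation of JDK internals: some library is reflectively accessing a `java.base` class, and the module has to be opened explicitly. A minimal sketch of the usual workaround, assuming the truncated trace above really points at `java.util` (verify against your full stack trace), is to add an `--add-opens` flag to the Flink JVM options:

```yaml
# flink-conf.yaml (Flink 1.18) -- appended JVM options for all Flink processes.
# The package to open (java.util here) is an assumption based on the
# truncated error message above; use the package named in your full trace.
env.java.opts.all: --add-opens=java.base/java.util=ALL-UNNAMED
```

The same flag can instead be passed directly on the JVM command line when running outside a Flink cluster.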
Hi, Dale.
I think there are two choices to try.
1. As suggested in the reply to #22427 [1], use the SQL function `COALESCE`.
2. Modify the code in Avro format by yourself.
Choice 2 requires some work. First, you need to pass the
default value in the Schema, which does not contain the default value no
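For choice 1, a minimal Flink SQL sketch could look like the following; the table name is hypothetical, and the column/default come from the Avro schema quoted later in this thread:

```sql
-- Substitute the schema's default when the field arrives as NULL.
-- "kafka_source_table" is a placeholder for your actual source table.
SELECT COALESCE(favouritePhrase, 'Hello World') AS favouritePhrase
FROM kafka_source_table;
```

This keeps the default-value handling in the query instead of patching the Avro format code.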
Hello
I'm getting a dependency error when using the latest Kafka connector in
a Scala project.
With the 1.17.1 Kafka connector, compilation is OK.
With
"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18"
I get
[error] (update) sbt.librarymanagement.ResolveException: Error
downloadi
I have a Kafka topic with events produced using an Avro schema like this:
{
  "namespace": "demo.avro",
  "type": "record",
  "name": "MySimplifiedRecreate",
  "fields": [
    {
      "name": "favouritePhrase",
      "type": "string",
      "default": "Hello World"
Hey Alexander,
Thanks for the feedback and apologies for my late reply.
This validates my understanding of AT_LEAST_ONCE with respect to the Kafka producer.
I tried to reproduce the issue but came back empty-handed. As you
pointed out, the culprit could be a call to an external,
non-idempotent API.
I'll f
Hi, Razin.
It seems the issue you shared is a different problem from yours; the
error messages are different.
Have you tried consuming this topic yourself with the Kafka Java client [1]
to make sure you can access it normally?
Best,
Hang
[1] https://developer.confluent.i
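As a quick sanity check along the lines Hang suggests, the console consumer that ships with Kafka (which is built on the Java client) can verify plain access to the topic before involving Flink at all. The broker address and topic name below are placeholders for your own setup:

```shell
# Placeholders: replace localhost:9092 and my-topic with your broker and topic.
# If this prints records, basic connectivity and authorization are fine.
kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic my-topic \
  --from-beginning
```

If this works but the Flink job still fails, the problem is more likely in the connector configuration than in topic access.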