Hi, I'm using Apache Flink 1.8.0.
I'm consuming events from Kafka with nothing fancy...
Properties props = new Properties();
props.setProperty("bootstrap.servers", kafkaAddress);
props.setProperty("group.id", kafkaGroup);
// plain String deserialization of the Kafka records
FlinkKafkaConsumer<String> consumer =
    new FlinkKafkaConsumer<>(topic, new SimpleStringSchema(), props);
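For context, a consumer built like this is typically wired into the job along
these lines (a minimal sketch; the sink and job name are just placeholders, not
part of the snippet above):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<String> stream = env.addSource(consumer);
stream.print();                        // any other sink/transformation works here
env.execute("kafka-consumer-example"); // placeholder job name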
Hello,
We are currently facing an issue where we need to store the instance of the
watermark and timestamp assigner in state while consuming from Kafka.
For that purpose we took a look at FlinkKafkaConsumerBase and noticed that
since the methods (snapshotState and initializeState from the
Ch
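To make the pattern concrete: snapshotState and initializeState are the methods
of Flink's CheckpointedFunction interface, and a user function usually
implements them roughly like the generic sketch below (the class and state
names are made up for illustration; this is not the Kafka consumer's own code):

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.state.ListState;
import org.apache.flink.api.common.state.ListStateDescriptor;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.runtime.state.FunctionInitializationContext;
import org.apache.flink.runtime.state.FunctionSnapshotContext;
import org.apache.flink.streaming.api.checkpoint.CheckpointedFunction;

// Counts records and snapshots the count on every checkpoint.
public class CountingMapper implements MapFunction<String, String>, CheckpointedFunction {

    private transient ListState<Long> checkpointedCount;
    private long count;

    @Override
    public String map(String value) {
        count++;
        return value;
    }

    @Override
    public void snapshotState(FunctionSnapshotContext ctx) throws Exception {
        // called on every checkpoint: write the current in-memory value into state
        checkpointedCount.clear();
        checkpointedCount.add(count);
    }

    @Override
    public void initializeState(FunctionInitializationContext ctx) throws Exception {
        // called before processing starts: declare the state and restore it if present
        checkpointedCount = ctx.getOperatorStateStore()
                .getListState(new ListStateDescriptor<>("count", Types.LONG));
        for (Long c : checkpointedCount.get()) {
            count = c;
        }
    }
}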
Hi to all,
in my use case I have a stream of POJOs with Date fields.
When I use the Table API I get the following error:
Exception in thread "main" org.apache.flink.table.api.ValidationException:
SQL validation failed. Type is not supported: Date
at
org.apache.flink.table.calcite.FlinkPlannerImpl.vali
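To make the setup concrete, the POJO looks roughly like this (a trimmed-down,
hypothetical version; the relevant part is the java.util.Date field):

import java.util.Date;

// Hypothetical POJO of the kind described above: public fields and a no-arg
// constructor, but with a java.util.Date field that the SQL validation rejects.
public class MyEvent {
    public String id;
    public Date eventTime; // java.util.Date

    public MyEvent() {}
}

As far as I understand, java.util.Date is extracted as Flink's DATE_TYPE_INFO,
which the Table/SQL validation does not support, while java.sql.Date/Time/Timestamp
map to the supported SQL time types, which would match the error above.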
Sorry, I don't get your point. Before answering the question, I guess we
need to make sure what exactly you want.
BTW, have you read the documentation on checkpointing and state? [1] [2]
Maybe it could help.
1.
https://ci.apache.org/projects/flink/flink-docs-release-1.8/concepts/programming-model.html#
Hi Gordon, thank you.
The involved data structure is a complex abstraction owning a schema and
values; it declares private fields which should not be edited directly by
users. I'd say it's really akin to an Avro GenericRecord. How would you
approach the problem if you have to serialize/deserializ
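For illustration only, one generic option I'm aware of is registering a custom
Kryo serializer for the type on the ExecutionConfig; a rough sketch follows,
where MyRecord, its toBytes()/fromBytes() methods, and MyRecordSerializer are
all hypothetical stand-ins for our structure, not a proposal of the actual API:

import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.Serializer;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;

// Hypothetical stand-in for the schema+values abstraction described above.
class MyRecord {
    private final byte[] encoded;
    MyRecord(byte[] encoded) { this.encoded = encoded; }
    byte[] toBytes() { return encoded; }
    static MyRecord fromBytes(byte[] bytes) { return new MyRecord(bytes); }
}

// Custom Kryo serializer that delegates to the type's own binary encoding.
class MyRecordSerializer extends Serializer<MyRecord> {
    @Override
    public void write(Kryo kryo, Output output, MyRecord record) {
        byte[] bytes = record.toBytes();
        output.writeInt(bytes.length);
        output.writeBytes(bytes);
    }

    @Override
    public MyRecord read(Kryo kryo, Input input, Class<MyRecord> type) {
        int len = input.readInt();
        return MyRecord.fromBytes(input.readBytes(len));
    }
}

and register it once on the environment when building the job:

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.getConfig().registerTypeWithKryoSerializer(MyRecord.class, MyRecordSerializer.class);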