Hi, I am trying to create a data streaming pipeline for real-time analytics.
Kafka -> Flink -> InfluxDB -> Visualize
I am facing an issue writing the data to InfluxDB. I can see logs in
'taskmanager' after submitting the Job. But, the data is not being
written to InfluxDB for some reason.
Please help.
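When a Flink job runs and logs normally but nothing shows up in InfluxDB, it helps to first confirm that InfluxDB itself accepts writes, independent of the Flink sink. Below is a minimal stdlib-only sketch that builds one point in InfluxDB line protocol and (optionally) POSTs it to the 1.x `/write` endpoint. The host, database name `metrics`, and measurement/tag/field names are all hypothetical; adjust them to your setup.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Smoke test for InfluxDB connectivity, outside of Flink.
// Assumes InfluxDB 1.x at localhost:8086 with a database "metrics"
// (hypothetical names).
public class InfluxSmokeTest {

    // Build one point in InfluxDB line protocol:
    //   measurement,tagKey=tagValue fieldKey=fieldValue timestampNs
    static String linePoint(String measurement, String tagKey, String tagValue,
                            String fieldKey, double fieldValue, long timestampNs) {
        return measurement + "," + tagKey + "=" + tagValue + " "
                + fieldKey + "=" + fieldValue + " " + timestampNs;
    }

    public static void main(String[] args) throws Exception {
        String body = linePoint("cpu", "host", "node1", "load", 0.64, 1706560000000000000L);
        System.out.println(body);

        // POST to the 1.x write endpoint; HTTP 204 means the write succeeded.
        // Commented out so the sketch runs without a live server:
        // HttpClient client = HttpClient.newHttpClient();
        // HttpRequest req = HttpRequest.newBuilder()
        //         .uri(URI.create("http://localhost:8086/write?db=metrics"))
        //         .POST(HttpRequest.BodyPublishers.ofString(body))
        //         .build();
        // HttpResponse<String> resp = client.send(req, HttpResponse.BodyHandlers.ofString());
        // System.out.println(resp.statusCode());
    }
}
```

If a direct write like this succeeds but the job's output still never arrives, the problem is usually on the sink side: batched writes that are never flushed, or the write happening against a different database/retention policy than the one you are querying.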
At last check, each pod uses about 200-400 millicores and 2.2 GB of memory.
On Wed, Jan 29, 2025 at 21:41, Guillermo Ortiz Fernández (<
guillermo.ortiz.f...@gmail.com>) wrote:
> I have a job entirely written in Flink SQL. The first part of the program
> processes 10 input topics and generates one output topic
I have a job entirely written in Flink SQL. The first part of the program
processes 10 input topics and generates one output topic with normalized
messages and some filtering applied (really simple: a few WHERE clauses on
fields and substring matches). Nine of the topics produce between hundreds
and thousands of messages
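A normalization-plus-filter stage like the one described usually reduces to one `INSERT INTO ... SELECT ... WHERE` per topic. The sketch below is hypothetical: the table names `events_in`/`events_out` and their columns are invented for illustration, assuming Kafka-backed source and sink tables have already been declared.

```sql
-- Hypothetical sketch of one per-topic normalization query.
INSERT INTO events_out
SELECT
  user_id,
  UPPER(event_type) AS event_type,                 -- normalize field
  SUBSTRING(payload FROM 1 FOR 256) AS payload     -- truncate payload
FROM events_in
WHERE event_type IS NOT NULL                       -- filter by field
  AND SUBSTRING(payload FROM 1 FOR 4) <> 'test';   -- substring-based filter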
I am on Apache Flink 1.20, trying to use the pure Java SQL API to create a
catalog and a table in it. When I try to do so, I get the following error:
```
21:31:41,939 ERROR dev.kameshs.demos.flink.iceberg
[] - Error running job
java.lang.IllegalArgumentException: t
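```

Judging by the package name `dev.kameshs.demos.flink.iceberg`, this looks like an Iceberg catalog, and an `IllegalArgumentException` at catalog creation is commonly caused by a missing or misspelled catalog property (this is an assumption; the full exception message is cut off above). A minimal sketch of an Iceberg catalog DDL on Flink, with an invented warehouse path:

```sql
-- Hypothetical Iceberg catalog; 'type' and 'catalog-type' are required,
-- and a Hadoop catalog also needs 'warehouse'. Path is for illustration.
CREATE CATALOG demo_catalog WITH (
  'type' = 'iceberg',
  'catalog-type' = 'hadoop',
  'warehouse' = 'file:///tmp/iceberg-warehouse'
);

CREATE DATABASE demo_catalog.demo_db;

CREATE TABLE demo_catalog.demo_db.events (
  id BIGINT,
  msg STRING
);
```

The same statements can be issued from pure Java by passing each one to `TableEnvironment.executeSql(...)`.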