> …the service name as a
> host, because that is how Docker Compose works. It is a common mistake to
> target “localhost”, which stays inside the container.
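>
> For illustration, the client construction on the job side might look like the
> sketch below; the service name “influxdb” and the credentials are assumptions,
> use whatever your docker-compose.yml actually defines:
>
> from influxdb_client import InfluxDBClient
>
> # Inside a container, "localhost" resolves to the container itself, so the
> # client must target the Compose service name instead.
> client = InfluxDBClient(
>     url="http://influxdb:8086",   # Compose service name, not "localhost"
>     token="my-token",             # placeholder token
>     org="my-org",                 # placeholder org
> )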
>
>
>
> Nix.
>
>
>
> From: Siva Krishna
> Date: Thursday, January 30, 2025 at 11:07 AM
> To: Nikola Milutinović
Hi, I am trying to create a data streaming pipeline for real-time analytics:
Kafka -> Flink -> InfluxDB -> Visualize
I am facing an issue writing the data to InfluxDB. I can see logs in
'taskmanager' after submitting the job, but the data is not being
written to InfluxDB for some reason.
Please help.
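
For context, the InfluxDB-writing step in such a PyFlink job often looks
roughly like the sketch below; the host "influxdb", bucket "metrics" and
credentials are placeholders, not the actual job's values:

from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS
from pyflink.datastream.functions import MapFunction


class InfluxDBWriter(MapFunction):
    """Writes each element as a point to InfluxDB and passes it through."""

    def open(self, runtime_context):
        # One client per parallel task instance.
        self.client = InfluxDBClient(url="http://influxdb:8086",
                                     token="my-token", org="my-org")
        self.write_api = self.client.write_api(write_options=SYNCHRONOUS)

    def map(self, value):
        # Assumes numeric elements; adapt the Point fields to the real schema.
        self.write_api.write(bucket="metrics",
                             record=Point("events").field("value", float(value)))
        return value

    def close(self):
        self.client.close()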
> “…:8086/…”. If it is “localhost:8086…” that would be a
> problem.
>
>
>
> Do you see a connection on InfluxDB? Anything in the logs? Anything in the
> logs of the Job Manager?
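>
> One quick way to check connectivity from inside the taskmanager container is
> to hit InfluxDB's /ping endpoint; the host name “influxdb” here is an
> assumption, use your actual service name:
>
> import urllib.request
>
> # InfluxDB answers GET /ping with 204 No Content when it is up; a timeout or
> # connection error here means this container cannot reach it at all.
> resp = urllib.request.urlopen("http://influxdb:8086/ping", timeout=5)
> print("InfluxDB reachable, HTTP status:", resp.status)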
>
>
>
> Nix,
>
>
>
> From: Siva Krishna
> Date: Thursday, January 30, 2025 at 5:12
> …you should be aware that
> if you want to track the progress of data through your pipeline, you need to
> add log statements inside your processors, sinks,… On our project we made a
> small LogProcessingFunction, which simply output the log statement we set in
> its constructor.
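>
> As a rough PyFlink sketch of that idea (the class name and prefix are only
> illustrative):
>
> import logging
>
> from pyflink.datastream.functions import MapFunction
>
>
> class LogProcessingFunction(MapFunction):
>     """Passes each element through unchanged, logging it with the prefix
>     given in the constructor, so you can see how far data gets."""
>
>     def __init__(self, prefix):
>         self.prefix = prefix
>
>     def map(self, value):
>         logging.info("%s: %s", self.prefix, value)
>         return value
>
> # e.g. stream.map(LogProcessingFunction("after Kafka source"))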
>
>
>
> …PyFlink
> JAR. Not sure what is wrong with it (yes, I know it should be
> file:///opt/flink..., but I’ve seen both variants). In any case, it is
> not your fault. At least, not directly.
>
>
>
> Anyway, I can confirm that we have been successfully launching PyFlink
> jobs from J
Hi guys,
I am facing an issue with running the Python job inside the Flink Docker
container.
I confirm that Python is properly installed.
I am trying to submit the Python job with the command below:
/opt/flink/bin/flink run -py /opt/flink/flink_job/main.py
No matter how many times I try, I am getting the error below