Hi to all,
we've successfully run our first streaming job on a Flink cluster (after some
problems with shading Guava) and it really outperforms Logstash in both
indexing speed and ease of use.

However, there's one problem: while the job is running, the plan visualizer
in the Job Monitoring UI shows two blocks:

   1. Source: Custom File Source (without any info about the file I'm
   reading)
   2. Split Reader: Custom File Source -> Sink: unnamed

Neither of them helps me understand which data I'm reading or writing
(for batch jobs this is usually displayed). Moreover, in the task details
the "Bytes sent / Records sent" counters don't make sense to me; I can't
tell what is actually being counted (see the attached image if available).
I see documents being indexed in ES, but nothing in the Flink Job UI tells
me how many documents are sent to ES, or from one function (Source) to the
other (Sink).
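For reference, this is roughly the shape of the job; it's a simplified
sketch with placeholder names and paths, not our real code, and I'm
assuming the DataStream API's .name() method is the intended way to label
the blocks in the plan:

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class FileToEsJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

            // This is what the UI renders as "Source: Custom File Source".
            DataStream<String> lines = env.readTextFile("/path/to/input");

            lines.map(new MapFunction<String, String>() {
                     @Override
                     public String map(String value) {
                         return value; // placeholder for our parsing step
                     }
                 })
                 .name("parse documents") // labels this block in the plan
                 .print()                 // stand-in for our Elasticsearch sink
                 .name("ES sink");        // otherwise shown as "Sink: unnamed"

            env.execute("file-to-es");
        }
    }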
I tried displaying some metrics and found something there, but I hope
that's not the usual way of monitoring streaming jobs. Am I doing something
wrong, or should streaming jobs be monitored with something else?
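In case it's useful, this is the kind of counting I tried; a minimal sketch
using an accumulator in a RichMapFunction (the accumulator name and the
class are just illustrative):

    import org.apache.flink.api.common.accumulators.LongCounter;
    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;

    // Counts every record flowing through the operator; the accumulator
    // value is reported back to the job UI / client.
    public class CountingMapper extends RichMapFunction<String, String> {
        private final LongCounter docsSeen = new LongCounter();

        @Override
        public void open(Configuration parameters) {
            getRuntimeContext().addAccumulator("docs-seen", docsSeen);
        }

        @Override
        public String map(String value) {
            docsSeen.add(1L); // one per document headed to the sink
            return value;
        }
    }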

Best,
Flavio
