For my own understanding, are you suggesting that FLINK-2944 (or a subtask)
is the appropriate place to implement exposing metrics such as bytes and
records in/out of streaming sources and sinks?
On Tue, Dec 15, 2015 at 5:24 AM, Niels Basjes wrote:
> Hi,
>
> @Ufuk: I added the env.disableOperato
Hi Martin,
From a quick look into the source code, it seems like the nodes are
not necessarily available after the TransportClient has been created.
The sampling may take several attempts and the check immediately after
the first try is a bit restrictive.
Nevertheless, if this happens consistent
Sure, excuse me if anything was obvious or wrong, I know next to nothing
about Hadoop.
1. get the Hadoop 2.7 distribution (I set its path to HADOOP_HOME to make
things easier for myself)
2. set the HADOOP_CLASSPATH to include
${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/tools/
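The two steps above could look like the following in a shell profile. Note that the install path and the `lib/*` completion of the truncated tools entry are my assumptions, not something stated in the original mail:

```shell
# Step 1: point HADOOP_HOME at the unpacked Hadoop 2.7 distribution
# (the path below is just an example -- adjust to your install).
export HADOOP_HOME=/opt/hadoop-2.7.1

# Step 2: put the common and tools jars on the Hadoop classpath.
# The "tools/lib/*" suffix is an assumed completion of the truncated
# path in the original message.
export HADOOP_CLASSPATH="${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/tools/lib/*"

echo "$HADOOP_CLASSPATH"
```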
Hi guys,
I have a question related to utilizing watermarks with multiple
FlinkKafkaConsumer082 instances. The aim is to have a global watermark across
multiple Kafka consumers, where a message from any Kafka partition would
update the same watermark. When testing a simple TimestampExtractor
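The bookkeeping behind that question can be sketched with a small stand-alone class. This is not the Flink `TimestampExtractor` interface itself (that would need the Flink jars); the class and method names are illustrative. What it shows is the actual semantics Flink applies downstream: each parallel source emits its own watermark, and an operator's effective watermark is the minimum over all of its inputs, so there is no single shared "global" watermark that any partition can advance on its own:

```java
// Stand-alone sketch of per-source watermark bookkeeping; illustrative
// names, not the Flink API.
public class WatermarkSketch {

    // Each source instance tracks the highest timestamp it has seen and
    // emits a watermark trailing it by a fixed allowed lag.
    static final class MaxLagExtractor {
        private final long maxLagMillis;
        private long currentMax = Long.MIN_VALUE;

        MaxLagExtractor(long maxLagMillis) {
            this.maxLagMillis = maxLagMillis;
        }

        // Called per record; remembers the largest timestamp so far.
        long extractTimestamp(long elementTimestamp) {
            currentMax = Math.max(currentMax, elementTimestamp);
            return elementTimestamp;
        }

        // The watermark this parallel source instance would emit.
        long currentWatermark() {
            return currentMax == Long.MIN_VALUE ? Long.MIN_VALUE : currentMax - maxLagMillis;
        }
    }

    public static void main(String[] args) {
        MaxLagExtractor a = new MaxLagExtractor(1000);
        MaxLagExtractor b = new MaxLagExtractor(1000);

        a.extractTimestamp(5_000); // partition read by consumer A
        b.extractTimestamp(2_000); // partition read by consumer B

        // Downstream, Flink takes the minimum over all inputs, so the
        // effective watermark is held back by the slowest source.
        long effective = Math.min(a.currentWatermark(), b.currentWatermark());
        System.out.println(effective); // prints 1000
    }
}
```

The practical consequence for the question above: an idle or slow Kafka partition holds the combined watermark back for everyone.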
Hi everyone,
I'm trying to connect my Flink streaming job to Elasticsearch, but I'm
having trouble making it work.
Here is the config I'm using for the connector:
HashMap<String, String> elConf = new HashMap<>();
elConf.put("bulk.flush.max.actions", "1");
elConf.put("cluster.name", "logelask");
List transports = ne
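For reference, here is the config-map part of the snippet above as a self-contained sketch. The `ElasticsearchSink` and transport-address wiring is omitted because it needs the Flink connector on the classpath; the two keys and the cluster name are taken from the poster's snippet, while the class and method names here are mine:

```java
import java.util.HashMap;
import java.util.Map;

public class EsConfigSketch {

    // Builds the user-config map the Elasticsearch connector expects:
    // plain String keys and String values.
    static Map<String, String> buildConfig() {
        Map<String, String> elConf = new HashMap<>();
        // Flush after every single action -- handy for debugging,
        // far too small for production throughput.
        elConf.put("bulk.flush.max.actions", "1");
        // Must match the cluster.name of the running Elasticsearch
        // cluster, otherwise the TransportClient refuses to connect.
        elConf.put("cluster.name", "logelask");
        return elConf;
    }

    public static void main(String[] args) {
        System.out.println(buildConfig());
    }
}
```

A mismatched `cluster.name` is a common cause of the connector silently failing to reach any nodes, so that value is worth double-checking first.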
Hi,
@Ufuk: I added the env.disableOperatorChaining() call, and indeed I now see
two separate operators on the screen, with counters showing what has happened.
@Stephan: Yes, I understand these numbers now.
I found that there is already a JIRA ticket to add what I was looking for:
https://issues.apache.org/j
Thanks for reporting!
Would you like to fix this and open a PR?
-Matthias
On 12/15/2015 04:43 AM, Radu Tudoran wrote:
> Hi,
>
>
>
> I believe I found two small inconsistencies in the documentation for the
> description of Window Apply
>
> https://ci.apache.org/projects/flink/flink-docs-relea
Hi,
I believe this question might have been asked before, so sorry for repeating
it (I just did not find the discussion on the mailing list).
Is it possible somehow to create a new DataStream from the elements that are
evicted from a window?
A simple use case for this is:
We have data
Hi,
If you want to play with it, you can find the source and basic documentation
here: https://github.com/ottogroup/flink-spector.
The framework is feature-complete for now. At the moment I'm working on
exposing some more functionality to the user, making the DSL more intuitive,
and scalatest suppo