Hi Igor!
What you can actually do is let a single FlinkKafkaConsumer consume from both
topics, producing a single DataStream which you can keyBy afterwards.
All versions of the FlinkKafkaConsumer support consuming multiple Kafka topics
simultaneously. This is logically the same as union and then
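For illustration, a rough sketch of that approach against the Flink 1.2-era Scala API (not compile-checked; the topic names, Kafka connector version, and key-extraction logic below are all made up for the example):

```scala
import java.util.Properties

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

val props = new Properties()
props.setProperty("bootstrap.servers", "localhost:9092")
props.setProperty("group.id", "join-job")

val env = StreamExecutionEnvironment.getExecutionEnvironment

// One consumer reading both topics; the constructor accepts a list of topics.
val consumer = new FlinkKafkaConsumer09[String](
  java.util.Arrays.asList("topicA", "topicB"),
  new SimpleStringSchema(),
  props)

// A single DataStream containing records from both topics, keyed afterwards
// by the shared join field (assumed here to be the first CSV column).
val keyed = env.addSource(consumer)
  .keyBy(record => record.split(",")(0))
```

After the keyBy, records from both topics with the same join key end up in the same key group, which gives the same grouping a union of the two streams would.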
Hi Stephan:
Thanks for your reply. I think that in standalone mode, Kerberos
authentication should not depend on Hadoop. What is your opinion?
By the way, when will Flink 1.2.0 come out?
Thanks in advance!
From: Stephan Ewen [mailto:se.
Hi,
I am using the 1.2-SNAPSHOT version of Apache Flink, which provides the new
enhanced Evictor functionality, and I am using customized triggers for the
Global Window. I have a use case where I am evicting unwanted events
(elements) from the current window before it is evaluated. However, I am
looking for opti
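For context, evicting elements before evaluation is what the evictBefore hook of the enhanced Evictor interface (added in Flink 1.2) is for. A hedged sketch of such an evictor on a Global Window; the event type and its "unwanted" flag are invented for the example:

```scala
import org.apache.flink.streaming.api.windowing.evictors.Evictor
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow
import org.apache.flink.streaming.runtime.operators.windowing.TimestampedValue

// Hypothetical event type with a flag marking records that should
// never reach the window function.
case class ClickEvent(unwanted: Boolean, timestamp: Long)

class UnwantedEventEvictor extends Evictor[ClickEvent, GlobalWindow] {

  // Runs before the window is evaluated: drop unwanted elements
  // by removing them through the iterator.
  override def evictBefore(elements: java.lang.Iterable[TimestampedValue[ClickEvent]],
                           size: Int,
                           window: GlobalWindow,
                           ctx: Evictor.EvictorContext): Unit = {
    val it = elements.iterator()
    while (it.hasNext) {
      if (it.next().getValue.unwanted) it.remove()
    }
  }

  // Nothing to remove after evaluation in this use case.
  override def evictAfter(elements: java.lang.Iterable[TimestampedValue[ClickEvent]],
                          size: Int,
                          window: GlobalWindow,
                          ctx: Evictor.EvictorContext): Unit = ()
}
```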
Hi,
I have a use case where I need to join two Kafka topics by some fields.
In general, I could put the content of one topic into the other and partition
by the same key, but I can't touch those two topics (i.e. there are other
consumers of those topics); on the other hand, it's essential to process s
Hi Stephan,
I want to pursue your idea. How do I emit state from an operator? An
operator, for me, is a rich function. Or will I need a different style of
operator? I am unable to find how to iterate over all state - in open() or
otherwise (from an operator).
Are there APIs to inspect the savepoints - usi
Hi, I was experimenting with the Queryable State feature and I have some
problems querying the state.
The code which I use to produce the queryable state is:
env.addSource(kafkaConsumer).map(
  e => e match {
    case LoginClickEvent(_, t) => ("login", 1, t)
    case LogoutClickEvent(_
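For reference, a hedged sketch of how I understand the producing side looks in the 1.2 API. The LogoutClickEvent arm is my guess at the symmetric case of the truncated snippet, and the state name is made up:

```scala
// Assuming the map yields (name, count, timestamp) tuples:
val counts = env.addSource(kafkaConsumer)
  .map(e => e match {
    case LoginClickEvent(_, t)  => ("login", 1, t)
    case LogoutClickEvent(_, t) => ("logout", 1, t)
  })
  .keyBy(0)                         // key by the event name
  .asQueryableState("click-counts") // state becomes queryable under this name
```

A QueryableStateClient then looks the state up by job ID, this state name, and a serialized key.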
I don't think there is a custom Flink action.
You can use the ssh action instead.
Thanks
Deepak
On Jan 8, 2017 17:31, "Malte Schwarzer" wrote:
> Hi all,
>
> does Flink currently support the integration into an Apache Oozie (
> https://oozie.apache.org/ ) workflow? Is there a Flink custom action for
> t
Hi all,
does Flink currently support the integration into an Apache Oozie (
https://oozie.apache.org/ ) workflow? Is there a Flink custom action for
that? If not, what would be the best approach to do that? Use a shell
action and execute the Flink binary?
Best regards,
Malte
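As a follow-up to the shell-action suggestion: a hedged sketch of what such an Oozie workflow action could look like. All names, paths, and properties here are made up; the script would wrap a `flink run` invocation of your job jar:

```xml
<!-- Hypothetical Oozie shell action launching a Flink job via the flink CLI. -->
<action name="run-flink-job">
  <shell xmlns="uri:oozie:shell-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <exec>run-flink.sh</exec>
    <file>${appPath}/run-flink.sh#run-flink.sh</file>
    <capture-output/>
  </shell>
  <ok to="end"/>
  <error to="fail"/>
</action>
```

where run-flink.sh would contain something along the lines of `flink run -c com.example.MyJob my-job.jar` (class and jar names are placeholders).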