I added Kafka to my dependencies, although I am not sure why this would be
required... it seems to work:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_${kafka.scala.version}</artifactId>
    <version>${kafka.version}</version>
</dependency>
This is my full dependency list.
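For comparison, a typical dependency block for consuming Confluent-serialized Avro might look like the following sketch; the `confluent.version` property, the repository entry, and the artifact choice are assumptions for illustration, not the poster's actual list:

```xml
<!-- Confluent artifacts are hosted in Confluent's own Maven repository -->
<repositories>
    <repository>
        <id>confluent</id>
        <url>http://packages.confluent.io/maven/</url>
    </repository>
</repositories>

<dependencies>
    <!-- Core Kafka: provides kafka.utils.VerifiableProperties -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_${kafka.scala.version}</artifactId>
        <version>${kafka.version}</version>
    </dependency>
    <!-- Confluent Avro (de)serializer and Schema Registry client -->
    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>${confluent.version}</version>
    </dependency>
</dependencies>
```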
I am trying to connect to the Schema Registry and deserialize records in my project.
When I build the project with mvn I get the error
class file for kafka.utils.VerifiableProperties not found...
(VerifiableProperties lives in the core kafka_2.x artifact, and the Confluent Avro decoder's API references it, which is presumably why adding that dependency makes the build pass.)
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.sch
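A minimal sketch of wiring the Schema Registry into a Flink Kafka source, assuming Flink's flink-avro-confluent-registry module (available from Flink 1.6) rather than a hand-rolled deserializer; the topic name, registry URL, broker address, and reader schema are all placeholders:

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class RegistryConsumerSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumption
        props.setProperty("group.id", "demo");                    // assumption

        // Reader schema; in practice this matches the writer schema in the registry.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
          + "{\"name\":\"device_id\",\"type\":\"long\"}]}");

        // The deserialization schema looks up writer schemas in the registry
        // (via a cached client) and decodes each record against them.
        FlinkKafkaConsumer011<GenericRecord> consumer = new FlinkKafkaConsumer011<>(
            "events", // topic name is an assumption
            ConfluentRegistryAvroDeserializationSchema.forGeneric(schema, "http://localhost:8081"),
            props);

        DataStream<GenericRecord> stream = env.addSource(consumer);
        stream.print();
        env.execute("registry-consumer-sketch");
    }
}
```

With this approach the CachedSchemaRegistryClient is managed internally, so the build does not need to touch the Confluent decoder classes that drag in kafka.utils.VerifiableProperties.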
I watched a video, published a while back, by Matt Zimmer at Flink Forward
Berlin (Sep 2017), titled "Custom, Complex Windows at Scale using
Apache Flink":
https://www.youtube.com/watch?v=XUvqnsWm8yo
In this talk he analyzes a custom window implementation that he wrote in order
By adding AfterMatchSkipStrategy.skipPastLastEvent() it returns what I
want.
Is there a way to track/emit "ongoing" events, i.e. before the pattern matches
the end event type?
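One way to surface "ongoing" (partial) matches, sketched under the assumption of a Flink 1.5+ CEP API: give the pattern a within() horizon and collect timed-out partial matches through a side output. The Event type, the ten-minute horizon, and the tag name are hypothetical:

```java
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.PatternTimeoutFunction;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.util.OutputTag;

public class PartialMatchSketch {
    // Hypothetical event type for illustration.
    public static class Event {
        public String triggerId;
    }

    public static SingleOutputStreamOperator<String> attach(DataStream<Event> input,
                                                            Pattern<Event, ?> pattern) {
        // Partial matches that never see the end event are reported when the
        // within() horizon expires, via this side output.
        OutputTag<String> ongoingTag = new OutputTag<String>("ongoing") {};

        PatternStream<Event> patternStream =
            CEP.pattern(input, pattern.within(Time.minutes(10))); // horizon is an assumption

        return patternStream.select(
            ongoingTag,
            (PatternTimeoutFunction<Event, String>) (partial, timeoutTs) ->
                "ongoing: " + partial.keySet(),
            (PatternSelectFunction<Event, String>) match ->
                "complete: " + match.keySet());
        // caller: result.getSideOutput(ongoingTag) yields the ongoing matches
    }
}
```

The trade-off is that "ongoing" here means "timed out", so an in-flight match only becomes visible once the horizon passes; true live emission would need a ProcessFunction instead of CEP.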
--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
I see you replied on
https://stackoverflow.com/questions/48028061/flink-cep-greedy-matching
pointing to a known bug:
https://issues.apache.org/jira/browse/FLINK-8914
In my case my pattern looks like
Pattern tripPattern =
Pattern.begin("start").times(1).where(START_CONDI
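A sketch of how such a trip pattern can carry the skip strategy mentioned earlier, with hypothetical START/TRACKING/STOP conditions standing in for the truncated ones; this assumes a recent Flink CEP API (the AfterMatchSkipStrategy package has moved between Flink versions):

```java
import org.apache.flink.cep.nfa.aftermatch.AfterMatchSkipStrategy;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;

public class TripPatternSketch {
    // Hypothetical event type for illustration.
    public static class Event {
        public String triggerId;
    }

    public static Pattern<Event, ?> tripPattern() {
        // skipPastLastEvent() prunes overlapping partial matches, so each
        // event contributes to at most one emitted match.
        return Pattern.<Event>begin("start", AfterMatchSkipStrategy.skipPastLastEvent())
            .times(1)
            .where(new SimpleCondition<Event>() {
                @Override
                public boolean filter(Event e) { return "START".equals(e.triggerId); }
            })
            .followedBy("middle").oneOrMore().optional()
            .where(new SimpleCondition<Event>() {
                @Override
                public boolean filter(Event e) { return "TRACKING".equals(e.triggerId); }
            })
            .followedBy("end")
            .where(new SimpleCondition<Event>() {
                @Override
                public boolean filter(Event e) { return "STOP".equals(e.triggerId); }
            });
    }
}
```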
I have looked into the CEP library. I have posted a question on
Stack Overflow:
https://stackoverflow.com/questions/49047879/global-windows-in-flink-using-custom-triggers-vs-flink-cep-pattern-api
However, the pattern matches every possible solution on the stream of
events. Does Pattern have a notion o
Could someone clarify how exactly event time, watermarks, and allowed lateness
work? I have created the program below, and I have an input file such as...
device_id,trigger_id,event_time,messageId
1,START,1520433909396,1
1,TRACKING,1520433914398,2
1,TRACKING,1520433919398,3
1,STOP,152
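To make the timing concrete, the sample event_time values can be run through the bounded-out-of-orderness rule used by Flink's BoundedOutOfOrdernessTimestampExtractor: watermark = max event time seen so far − allowed out-of-orderness. A window fires once the watermark passes its end timestamp, and allowedLateness() keeps the window state around after that point. The 5-second bound below is an assumption for illustration, and this is plain-Java arithmetic, not the Flink API itself:

```java
import java.util.Arrays;
import java.util.List;

public class WatermarkArithmetic {

    // Bounded-out-of-orderness rule: the watermark trails the largest
    // event timestamp seen so far by a fixed bound.
    static long watermark(List<Long> eventTimesMs, long outOfOrdernessMs) {
        long max = Long.MIN_VALUE;
        for (long t : eventTimesMs) {
            max = Math.max(max, t);
        }
        return max - outOfOrdernessMs;
    }

    public static void main(String[] args) {
        // event_time values from the sample input file above
        List<Long> seen = Arrays.asList(1520433909396L, 1520433914398L, 1520433919398L);
        long wm = watermark(seen, 5000L); // 5 s bound (assumption)
        // 1520433919398 - 5000 = 1520433914398: any event with a timestamp at
        // or below the watermark is treated as late from here on.
        System.out.println(wm);
    }
}
```

So with a 5 s bound, message 2 (event_time 1520433914398) sits exactly at the watermark after message 3 arrives; whether a late record is still accepted then depends on the window's allowedLateness.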