Thanks for your response!
Yes, I'm about 99.9% sure there shouldn't be a "late event", and I would also
implement logic in the ProcessFunction that checks for a specific order
of the events per transaction id.
Calling clear() on the state should free the resources, and
using that many
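The idea described above (a ProcessFunction that enforces a per-transaction-id event order in keyed state and clears the state when the transaction is complete) could be sketched roughly like this. All names (`Event`, `txId`, `seqNo`, the fixed final sequence number) are hypothetical stand-ins, not from the original thread; the sketch assumes Flink 1.3.x, where keyed state is used inside a plain `ProcessFunction` applied after `keyBy`:

```scala
import org.apache.flink.api.common.state.{ValueState, ValueStateDescriptor}
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.functions.ProcessFunction
import org.apache.flink.util.Collector

// Hypothetical event type: txId keys the stream, seqNo encodes the expected order.
case class Event(txId: String, seqNo: Int, payload: String)

class OrderCheckingFunction extends ProcessFunction[Event, Event] {

  // Assumed number of events per transaction (illustrative only).
  private val FinalSeqNo = 3

  // Last sequence number seen for the current transaction id (keyed state).
  @transient private var lastSeq: ValueState[Integer] = _

  override def open(parameters: Configuration): Unit = {
    lastSeq = getRuntimeContext.getState(
      new ValueStateDescriptor[Integer]("lastSeq", classOf[Integer]))
  }

  override def processElement(event: Event,
                              ctx: ProcessFunction[Event, Event]#Context,
                              out: Collector[Event]): Unit = {
    val prev = lastSeq.value()
    if (prev == null || event.seqNo == prev + 1) {
      lastSeq.update(event.seqNo)
      out.collect(event)
      if (event.seqNo == FinalSeqNo) {
        // Transaction complete: drop the per-key state to free resources.
        lastSeq.clear()
      }
    }
    // else: out-of-order event; log it or route it to a side output as needed.
  }
}
```

It would be applied after keying the stream, e.g. `stream.keyBy(_.txId).process(new OrderCheckingFunction)`.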
Hi,
I have a question about a specific use case and hope that you can help me
with it. I have a streaming pipeline and receive events of different types. I
parse the events into their internal representation and do some
transformations on them. Some of these events I want to collect internally
(groupe
Hey Timo,
when I try to access the array in the case class via the SQL syntax, I get
an error that the syntax is invalid. Here is an example of the case
class structure:
case class Envelope(name: String, entity: Product)
case class Product(name: String, items: List[Item])
case class Item(attr
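For illustration, one way this nested structure could be queried is sketched below. This assumes Flink 1.3.x APIs (`tEnv.sql`, `TableEnvironment.getTableEnvironment`) and replaces the Scala `List` with an `Array`, since Flink's type extraction treats Scala Lists as opaque generic types that the SQL layer cannot look into. The completed `Item` field and the sample data are hypothetical, and whether field access inside an array element (`items[1].attr`) is accepted may depend on the exact Flink version; SQL array indices are 1-based:

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.TableEnvironment

// Hypothetical variant of the case classes, with Array instead of List.
case class Item(attr: String)
case class Product(name: String, items: Array[Item])
case class Envelope(name: String, entity: Product)

object ArrayAccessExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    val stream: DataStream[Envelope] = env.fromElements(
      Envelope("e1", Product("p1", Array(Item("a"), Item("b")))))

    tEnv.registerDataStream("Envelopes", stream)

    // Nested fields are reached with dots; array access is 1-based.
    val result = tEnv.sql(
      "SELECT name, entity.items[1].attr FROM Envelopes")

    result.toAppendStream[(String, String)].print()
    env.execute()
  }
}
```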
Hey Timo,
thanks for your warm welcome and for creating a ticket to fix this!
My scenario is the following:
I receive different JSON entities from an AMQP queue. I have a source that
collects the events; after that I parse them into the different internal case
classes and split the stream via the spl
Hey,
I need some help regarding the SQL and Table API. I'm using Apache Flink
1.3.2 with Scala 2.11.11 (I also tried 2.11.8). I created a DataStream
(based on a Scala case class) and registered it as a table. The case
class includes some Lists, because the underlying JSON has some Arrays in
t
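A common workaround for the List-typed fields described here is to map the stream to a variant of the case class that uses `Array` before registering it as a table, since Flink's type extraction treats `scala.List` as an opaque generic type while object arrays are understood by the Table/SQL API. The record types and field names below are hypothetical, and the sketch assumes Flink 1.3.x:

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.TableEnvironment

// Hypothetical record types: the raw class uses a Scala List (opaque to
// Flink's type extraction), the table variant uses an Array instead.
case class RawRecord(id: String, tags: List[String])
case class TableRecord(id: String, tags: Array[String])

object ListToArrayExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    val raw: DataStream[RawRecord] =
      env.fromElements(RawRecord("r1", List("a", "b")))

    // Convert List -> Array before registering the stream as a table.
    val converted: DataStream[TableRecord] =
      raw.map(r => TableRecord(r.id, r.tags.toArray))

    tEnv.registerDataStream("Records", converted)

    // Inspect the resulting table schema to confirm the array field type.
    tEnv.scan("Records").printSchema()
  }
}
```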