Hi everyone,
I’m currently using Flink to calculate a moving average with a window size of
60 days and a slide of 30 days, and it works OK. The data I have are not
continuous: one month has data, and then for three or four months (or even
more) there are none. Is there a way to output the mo
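Outside Flink, the semantics of such a sliding window over sparse data can be sketched in plain Java (all names and numbers below are illustrative, not the poster's actual job): windows that contain no data simply produce no row, which is what a sliding group window does for empty panes.

```java
import java.util.*;

public class SlidingAverage {
    // Hypothetical data point: day offset from some origin, and a value.
    record Point(long day, double value) {}

    // Average of points in [start, start + sizeDays), for windows starting
    // every slideDays. Empty windows produce no entry, mirroring how a
    // sliding group window emits nothing for panes without data.
    static Map<Long, Double> slidingAverages(List<Point> points,
                                             long sizeDays, long slideDays,
                                             long horizonDays) {
        Map<Long, Double> out = new LinkedHashMap<>();
        for (long start = 0; start + sizeDays <= horizonDays; start += slideDays) {
            double sum = 0;
            int count = 0;
            for (Point p : points) {
                if (p.day() >= start && p.day() < start + sizeDays) {
                    sum += p.value();
                    count++;
                }
            }
            if (count > 0) out.put(start, sum / count);
        }
        return out;
    }

    public static void main(String[] args) {
        // Data only in month 0 and month 4, as in the sparse case described.
        List<Point> points = List.of(new Point(10, 2.0), new Point(130, 4.0));
        System.out.println(slidingAverages(points, 60, 30, 180));
        // prints {0=2.0, 90=4.0, 120=4.0} — the windows starting at day 30
        // and day 60 hold no data and are simply absent from the output.
    }
}
```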
Hi everyone,
I’m using the code below to calculate the moving average on some data.
Table averageJudgeByPhaseReport = filteredPhasesDurationsTable
.window(Slide.over(lit(WINDOW_SIZE_IN_DAYS_REPORT).days())
.every(lit(WINDOW_SLIDE_IN_DAYS_REPORT).days())
.on($("en
ain, not sure how to configure this
>
> Kind regards again
>
> Thias
>
> From: Eugenio Marotti
> Sent: Thursday, September 21, 2023 2:35 PM
> To: Schwalbe Matthias
> Cc: user@flink.apache.org
> Subject: Re: Window aggregation on tw
me.
>
> What do you think?
>
> Kind regards
>
> Thias
>
>
> From: Eugenio Marotti
> Sent: Thursday, September 21, 2023 8:56 AM
> To: user@flink.apache.org
> Subject: Window aggregation on two joined table
>
> Hi,
>
> I’m trying to execute a wind
Hi,
I’m trying to execute a window aggregation on two joined tables from two Kafka
topics (upsert fashion), but I get no output. Here’s the code I’m using:
This is the first table from Kafka with an event time watermark on ‘data_fine’
attribute:
final TableDescriptor phasesDurationsTableDescrip
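The descriptor is cut off above; purely as a hedged sketch (the connector choice, topic, variable name, field names other than data_fine, and types are assumptions, not the poster's actual configuration), a Kafka-backed table with an event-time watermark on data_fine can be declared like this:

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.TableDescriptor;

final TableDescriptor phasesTableDescriptor =          // hypothetical name
    TableDescriptor.forConnector("kafka")              // "upsert-kafka" for upsert topics
        .schema(Schema.newBuilder()
            .column("id", DataTypes.BIGINT())
            .column("data_fine", DataTypes.TIMESTAMP(3))
            // Event-time watermark on data_fine, tolerating 5s out-of-orderness.
            .watermark("data_fine", "data_fine - INTERVAL '5' SECOND")
            .build())
        .option("topic", "phases_durations")           // hypothetical topic
        .option("properties.bootstrap.servers", "localhost:9092")
        .format("json")
        .build();
```

Without a watermark (or with one that never advances because the source is idle), an event-time window over this table will not fire, which is one common reason a windowed join produces no output.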
Hi everyone,
is there a way to set Flink’s processing time to a point in the past?
Thanks
Eugenio
Hi everyone,
I’m trying to calculate an average with a sliding window. Here’s the code I’m
using. First of all I receive a series of events from a Kafka topic. I declared
a watermark on the ‘data_fine’ attribute.
final TableDescriptor filteredPhasesDurationsTableDescriptor =
TableDescriptor.fo
Hi everyone,
I'm using Flink for processing some streaming data. First of all I have two
tables receiving events from Kafka. These tables are joined and the resulting
table is converted to a DataStream, which is processed by a custom
KeyedProcessFunction. The output is then converted to a tab
Hi,
I’m currently using the Opensearch Connector for the Table API. For testing I
need to disable the hostname verification. Is there a way to do this?
Thanks
Eugenio
I'm implementing a data analysis pipeline in Flink and I have a problem
converting a DataStream to a Table. I have this table defined from a join
between two Kafka sources:
Table legalFileEventsTable = legalFilesTable.join(eventsTable)
.where($("id").isEqual($("id_fascicolo")))
Hi everyone. I'm developing a monitoring app and I want to use Flink to process
the event stream. I need to start a timer when an event is received in Flink,
send the timer value and stop the timer when another event is received. Let me
explain better. An event consists of an event name, a sourc
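The bookkeeping described above (open a timer when one event arrives, report the elapsed time and close it when a matching later event arrives) can be sketched in plain Java. In Flink this state would live in a KeyedProcessFunction's keyed ValueState rather than a map, and every name below is illustrative:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.OptionalLong;

public class EventTimerTracker {
    // Hypothetical event shape: name, source, and a timestamp in millis.
    record Event(String name, String source, long tsMillis) {}

    // One open "timer" (start timestamp) per (name, source) key.
    // In Flink this map would be keyed ValueState<Long> per stream key.
    private final Map<String, Long> openTimers = new HashMap<>();

    // A start event opens a timer; a stop event for the same key closes it
    // and returns the elapsed millis. Unmatched stops return empty.
    public OptionalLong onEvent(Event e, boolean isStart) {
        String key = e.name() + "|" + e.source();
        if (isStart) {
            openTimers.put(key, e.tsMillis());
            return OptionalLong.empty();
        }
        Long started = openTimers.remove(key);
        return started == null ? OptionalLong.empty()
                               : OptionalLong.of(e.tsMillis() - started);
    }
}
```

Keying the stream by (name, source) lets each key keep its own start timestamp independently, which is exactly what keyed state gives you in a KeyedProcessFunction.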