[…] message should always be accumulate message)
>
> Best, JingsongLee
>
> --
> From: Caizhi Weng
> Send Time: July 16, 2019 (Tuesday) 09:52
> To: sri hari kali charan Tummala
> Cc: user
> Subject: Re: Stream to CSV Sink with SQL Distinct Values
Hi Weng,
Another issue now (Exception in thread "main"
org.apache.flink.table.api.TableException: Only tables that originate from
Scala DataStreams can be converted to Scala DataStreams.). Here is the full
code:
https://github.com/kali786516/FlinkStreamAndSql/blob/15e5e60d6c044bc830f5ef2d79c071389e
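For the record, that exception usually means the table was created through the Java `StreamTableEnvironment` (or a Java `DataStream`) but converted with the Scala API. A minimal sketch of an all-Scala setup, assuming Flink 1.8/1.9-era class names and a hypothetical in-memory source standing in for the Kinesis consumer:

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.scala._ // Scala table env, conversions, 'symbol expressions
import org.apache.flink.types.Row

object DistinctSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Use the *Scala* StreamTableEnvironment throughout; mixing the Java
    // environment with Scala conversions raises "Only tables that originate
    // from Scala DataStreams can be converted to Scala DataStreams."
    val tableEnv = StreamTableEnvironment.create(env)

    // Hypothetical source standing in for the Kinesis stream.
    val stream: DataStream[(String, Double)] =
      env.fromElements(("a", 1.0), ("b", 2.0), ("a", 1.0))

    tableEnv.registerDataStream("items", stream, 'name, 'price)
    val result = tableEnv.sqlQuery("SELECT DISTINCT name FROM items")

    // Works because the table originates from a Scala DataStream.
    result.toRetractStream[Row].print()
    env.execute("distinct-sketch")
  }
}
```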
Hi Kali,
Currently Flink treats all aggregate functions as retractable. As
`distinct` is an aggregate function, it's considered by the planner that it
might update or retract records (although from my perspective it won't...).
Because the CSV table sink is an append-only sink (it's hard to update what
has already been written to a file), it cannot accept the retract stream
produced by `distinct`, which is why the exception occurs.
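A hedged sketch of one way around this: convert the result to a retract stream yourself and keep only the accumulate messages before handing it to an append-only sink. Method names follow the Flink 1.8-era Scala Table API; the table and path names are made up:

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.scala._
import org.apache.flink.types.Row

// Assumes a Scala StreamTableEnvironment `tableEnv` with a registered
// source table; `productName` and `kinesis_table` are hypothetical.
val distinctTable = tableEnv.sqlQuery(
  "SELECT DISTINCT productName FROM kinesis_table")

// toRetractStream yields (flag, row) pairs: true = accumulate (insert),
// false = retract (delete).
val retractStream: DataStream[(Boolean, Row)] =
  distinctTable.toRetractStream[Row]

retractStream
  .filter(_._1)                     // keep accumulate messages only
  .map(_._2.toString)               // Row -> one text line
  .writeAsText("/tmp/distinct-out") // an append-only sink now accepts it
```

The trade-off is that values retracted upstream are never deleted from the output file, which is usually acceptable for `DISTINCT` since it only ever adds new keys.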
Hi All,
I am trying to read data from a Kinesis stream, apply a SQL
transformation (DISTINCT), and then write to a CSV sink, which is
failing due to this issue (org.apache.flink.table.api.TableException:
AppendStreamTableSink requires that Table has only insert changes.). Full
code is here:
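A minimal sketch of the pattern that triggers this exception, with hypothetical table, field, and path names: `CsvTableSink` is an `AppendStreamTableSink`, while `SELECT DISTINCT` produces an updating table, so the planner rejects the insert.

```scala
import org.apache.flink.table.api.Types
import org.apache.flink.table.sinks.CsvTableSink

// Assumes a Scala StreamTableEnvironment `tableEnv` with a registered
// source table `kinesis_table` (hypothetical names).
tableEnv.registerTableSink(
  "csv_out",
  new CsvTableSink("/tmp/out.csv", ",").configure(
    Array("productName"), Array(Types.STRING)))

// Fails at planning time with:
// "AppendStreamTableSink requires that Table has only insert changes."
tableEnv.sqlUpdate(
  "INSERT INTO csv_out SELECT DISTINCT productName FROM kinesis_table")
```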