Hi Pramit:
AppendStreamTableSink defines an external TableSink to emit a streaming table
with only insert changes. If the table is also modified by update or delete
changes, a TableException will be thrown.[1]
Your SQL seems to have update or delete changes.
You can try to use
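For intuition (not part of the original thread): a grouped aggregate such as MAX produces updates, which Flink models as a retract stream of (add/retract, row) pairs. The sketch below is plain Java, not the Flink API — the names `Change` and `runningMax` are made up for illustration — and shows why an append-only sink cannot absorb such a stream:

```java
import java.util.ArrayList;
import java.util.List;

public class RetractSketch {
    // A change in a retract stream: isAdd=false withdraws a row emitted earlier.
    static final class Change {
        final boolean isAdd;
        final int value;
        Change(boolean isAdd, int value) { this.isAdd = isAdd; this.value = value; }
    }

    // Running MAX over a single key: each new maximum retracts the previous result.
    static List<Change> runningMax(int... values) {
        List<Change> out = new ArrayList<>();
        Integer max = null;
        for (int v : values) {
            if (max == null || v > max) {
                if (max != null) out.add(new Change(false, max)); // retract old max
                max = v;
                out.add(new Change(true, v)); // add new max
            }
        }
        return out;
    }

    public static void main(String[] args) {
        for (Change c : runningMax(1, 5, 3)) {
            System.out.println((c.isAdd ? "+" : "-") + c.value);
        }
        // prints +1, -1, +5 — the "-1" retraction is exactly what an
        // append-only sink cannot handle.
    }
}
```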
Hi,
I am attempting the following:
String sql = "INSERT INTO table3 "
    + "SELECT col1, col2, window_start_time, window_end_time, "
    + "MAX(col3), MAX(col4), MAX(col5) FROM "
    + "(SELECT col1, col2, "
    + "TUMBLE_START(ts, INTERVAL '1' MINUTE) as window_start_time, "
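For reference (not part of the thread): TUMBLE_START/TUMBLE_END align each row's timestamp to a fixed-size, epoch-aligned window. In plain Java, the alignment for a window with zero offset comes down to a modulo:

```java
public class TumbleSketch {
    // Start of the tumbling window containing tsMillis, for epoch-aligned
    // windows of the given size (both in milliseconds). Mirrors what
    // TUMBLE_START computes for a zero window offset.
    static long windowStart(long tsMillis, long sizeMillis) {
        return tsMillis - (tsMillis % sizeMillis);
    }

    // Exclusive end of the same window, as TUMBLE_END would report it.
    static long windowEnd(long tsMillis, long sizeMillis) {
        return windowStart(tsMillis, sizeMillis) + sizeMillis;
    }

    public static void main(String[] args) {
        long oneMinute = 60_000L;
        long ts = 90_500L; // 1 min 30.5 s after epoch
        System.out.println(windowStart(ts, oneMinute)); // 60000
        System.out.println(windowEnd(ts, oneMinute));   // 120000
    }
}
```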
Bill Lee,
Man, you saved me from headbanging :) Thank you!
2018-03-10 0:25 GMT+02:00 杨力 :
> To use a field in a table as timestamp, it must be declared as a rowtime
> attribute for the table.
>
> 1) Call env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime).
> 2) Call withRowtimeAttribute on KafkaJsonTableSourceBuilder.
To use a field in a table as timestamp, it must be declared as a rowtime
attribute for the table.
1) Call env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime).
2) Call withRowtimeAttribute on KafkaJsonTableSourceBuilder.
Reference:
1.
https://ci.apache.org/projects/flink/flink-docs-relea
Hi everyone!
I decided to try the Time-windowed join functionality of Flink 1.4+.
My SQL query is an exact copy of the example in the documentation, and the
program reads and writes from Kafka.
I used the example from here:
https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/table/sq
Thanks! I didn't know this works as well.
Cheers,
Sendoh
--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
Yes.
Adding .returns(typeInfo) works as well. :-)
2017-12-08 11:29 GMT+01:00 Fabian Hueske :
> Hi,
>
> you give the TypeInformation to your user code but you don't expose it to
> the DataStream API (the code of the FlatMapFunction is a black box for the
> API).
> Your FlatMapFunction should implement
Hi,
you give the TypeInformation to your user code but you don't expose it to
the DataStream API (the code of the FlatMapFunction is a black box for the
API).
Your FlatMapFunction should implement the ResultTypeQueryable interface
and return the TypeInformation.
Best, Fabian
2017-12-08 11:19 G
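Background for the ResultTypeQueryable suggestion (a conceptual sketch, not Flink code: `TypedFunction` below stands in for Flink's ResultTypeQueryable, and `Class<T>` for TypeInformation<T>). Java erases generic parameters at runtime, so a framework holding your function as a black box cannot see its output type unless the function exposes it:

```java
import java.util.function.Function;

public class ErasureSketch {
    // Conceptual stand-in for Flink's ResultTypeQueryable: the function
    // itself reports its produced type, which type erasure otherwise hides.
    interface TypedFunction<I, O> extends Function<I, O> {
        Class<O> getProducedType();
    }

    static TypedFunction<String, Integer> parser() {
        return new TypedFunction<String, Integer>() {
            @Override public Integer apply(String s) { return Integer.parseInt(s); }
            @Override public Class<Integer> getProducedType() { return Integer.class; }
        };
    }

    public static void main(String[] args) {
        TypedFunction<String, Integer> f = parser();
        // A framework holding only a Function<?, ?> cannot recover "Integer"
        // by reflection on the erased generics; asking the function works:
        System.out.println(f.getProducedType().getSimpleName()); // Integer
        System.out.println(f.apply("42")); // 42
    }
}
```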
Found it. I should use .returns(typeInformation) after the map function.
--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
Hi Flink users,
I found the workarounds to resolve this exception in scala.
https://issues.apache.org/jira/browse/FLINK-6500
But I already provide the TypeInformation when deserializing json object,
and still see this exception.
Is there anything I ignore?
The sample code
https://gist.github.com