Re: How does Flink handle short-lived keyed streams

2020-12-24 Thread Xintong Song
I believe what you are looking for is the State TTL [1][2]. Thank you~ Xintong Song [1] https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/state/state.html#state-time-to-live-ttl [2] https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/config.html#table-exec-state-ttl
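Not part of Xintong's reply, but for readers of the archive: a minimal sketch of enabling State TTL on keyed state in the DataStream API, assuming a 10-minute expiry is acceptable. The class name, state name, and TTL value are illustrative, not from the thread.

```java
import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.api.common.state.StateTtlConfig;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.time.Time;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.util.Collector;

// Sketch: keyed state that expires 10 minutes after the last write,
// so state held for short-lived keys is cleaned up automatically.
public class TtlExample extends RichFlatMapFunction<Long, Long> {

    private transient ValueState<Long> lastValue;

    @Override
    public void open(Configuration parameters) {
        StateTtlConfig ttlConfig = StateTtlConfig
                .newBuilder(Time.minutes(10))
                .setUpdateType(StateTtlConfig.UpdateType.OnCreateAndWrite)
                .setStateVisibility(StateTtlConfig.StateVisibility.NeverReturnExpired)
                .build();

        ValueStateDescriptor<Long> descriptor =
                new ValueStateDescriptor<>("lastValue", Long.class);
        descriptor.enableTimeToLive(ttlConfig);
        lastValue = getRuntimeContext().getState(descriptor);
    }

    @Override
    public void flatMap(Long value, Collector<Long> out) throws Exception {
        lastValue.update(value);
        out.collect(value);
    }
}
```

For the Table/SQL API, reference [2] points at the table.exec.state.ttl configuration option instead.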

RE: RE: checkpointing seems to be throttled.

2020-12-24 Thread Colletta, Edward
FYI, this was an EFS issue. I originally dismissed EFS as the cause because the Percent I/O limit metric was very low, but I later noticed that throughput utilization was very high. We increased the provisioned throughput and the checkpoint times are now greatly reduced. From: Colletta, Edwar

Re: How does Flink handle short-lived keyed streams

2020-12-24 Thread narasimha
Thanks Xintong. I'll check it out and get back to you. On Thu, Dec 24, 2020 at 1:30 PM Xintong Song wrote: > I believe what you are looking for is the State TTL [1][2]. > > > Thank you~ > > Xintong Song > > > [1] > https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/state/state.ht

Registering RawType with LegacyTypeInfo via Flink CREATE TABLE DDL

2020-12-24 Thread Yuval Itzchakov
Hi, I have a UDF which returns a MAP type containing a RAW value. When I try to register this type with Flink via the CREATE TABLE DDL, I encounter an exception: - SQL parse failed. Encountered "(" at line 2, column 256. Was expecting one of: "NOT" ... "NULL" ... ">" ... "MULTISET" ... "ARRAY"

Re: Registering RawType with LegacyTypeInfo via Flink CREATE TABLE DDL

2020-12-24 Thread Yuval Itzchakov
An expansion to my question: What I really want is for the UDF to return `RAW(io.circe.Json, ?)` type, but I have to do a conversion between Table and DataStream, and TypeConversions.fromDataTypeToLegacyInfo cannot convert a plain RAW type back to TypeInformation. On Thu, Dec 24, 2020 at 12:59 PM
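Not from the thread itself: a hedged sketch of one way to declare a RAW result type on a scalar UDF via a @DataTypeHint annotation, using a placeholder holder class instead of io.circe.Json. Whether this form resolves the MAP and legacy-conversion problem discussed above is exactly what the thread is asking, so treat it as an assumption rather than a confirmed fix.

```java
import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.functions.ScalarFunction;

// Sketch only: "JsonHolder" is a stand-in for a class like io.circe.Json.
public class ParseJsonFunction extends ScalarFunction {

    public static class JsonHolder {
        public final String raw;
        public JsonHolder(String raw) { this.raw = raw; }
    }

    // Declare the result as a RAW type so the planner treats the value as opaque
    // and serializes it with Flink's generic serializer.
    public @DataTypeHint(value = "RAW", bridgedTo = JsonHolder.class) JsonHolder eval(String input) {
        return new JsonHolder(input);
    }
}
```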

StreamingFileSink closed file exception

2020-12-24 Thread Billy Bain
I am new to Flink and am trying to process a file and write it out formatted as JSON. This is a much simplified version. public class AndroidReader { public static void main(String[] args) throws Exception { final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecuti
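The program in the archive is cut off; as context only, a minimal sketch of wiring a row-format StreamingFileSink is shown below. This is not the reporter's actual code: the paths, job name, and SimpleStringEncoder are placeholders (the reporter writes JSON, so a custom Encoder would take the encoder's place; see the reply further down in this digest).

```java
import org.apache.flink.api.common.serialization.SimpleStringEncoder;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;

public class AndroidReaderSketch {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        // StreamingFileSink finalizes in-progress files on checkpoints,
        // so checkpointing should be enabled.
        env.enableCheckpointing(10_000);

        env.readTextFile("/tmp/input.txt")
           .addSink(StreamingFileSink
                   .forRowFormat(new Path("/tmp/output"), new SimpleStringEncoder<String>("UTF-8"))
                   .build());

        env.execute("android-reader");
    }
}
```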

Flink + DDL reads from Kafka with no output, but the Kafka consumer sees data

2020-12-24 Thread Appleyuchi
This is Flink 1.12. The Kafka console consumer can read the data, but the code below cannot: it runs without errors and produces no output. Any help is appreciated, thanks. import org.apache.flink.streaming.api.scala._ import org.apache.flink.table.api.{EnvironmentSettings, Table} import org.apache.flink.table.api.bridge.scala.StreamTableEnvironment import org.apache.flink.types.Row import org.apache.flin

Realtime Data processing from HBase

2020-12-24 Thread s_penakalap...@yahoo.com
Hi Team, I recently encountered a use case in my project, described below: my data source is HBase. We receive a huge volume of data at very high speed into HBase tables from the source system. I need to read from HBase, perform computation, and insert into PostgreSQL. I would like a few inputs on the below poi

FileSink class in 1.12?

2020-12-24 Thread Billy Bain
I can't seem to find the org.apache.flink.connector.file.sink.FileSink class. I can find the StreamingFileSink, but not FileSink referenced here: https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/connectors/file_sink.html Am I missing a dependency? compile group: 'org.apache.f

Re: FileSink class in 1.12?

2020-12-24 Thread Billy Bain
Of course I found it shortly after submitting my query. compile group: 'org.apache.flink', name: 'flink-connector-files', version: '1.12.0' On 2020/12/24 15:57:20, Billy Bain wrote: > I can't seem to find the org.apache.flink.connector.file.sink.FileSink > class. > > I can find the Streaming
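For completeness (not in the original message): with that flink-connector-files dependency on the classpath, a minimal FileSink usage in the spirit of the 1.12 docs might look like the sketch below. Paths, the sample elements, and the job name are placeholders.

```java
import org.apache.flink.api.common.serialization.SimpleStringEncoder;
import org.apache.flink.connector.file.sink.FileSink;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FileSinkExample {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        // FileSink commits files on checkpoints, so enable checkpointing.
        env.enableCheckpointing(10_000);

        FileSink<String> sink = FileSink
                .forRowFormat(new Path("/tmp/output"), new SimpleStringEncoder<String>("UTF-8"))
                .build();

        env.fromElements("a", "b", "c").sinkTo(sink);

        env.execute("file-sink-example");
    }
}
```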

Re: Issue in WordCount Example with DataStream API in BATCH RuntimeExecutionMode

2020-12-24 Thread Aljoscha Krettek
Thanks for reporting this! This is not the expected behaviour; I created a Jira issue: https://issues.apache.org/jira/browse/FLINK-20764. Best, Aljoscha On 23.12.20 22:26, David Anderson wrote: I did a little experiment, and I was able to reproduce this if I use the sum aggregator on KeyedStre

Re: StreamingFileSink closed file exception

2020-12-24 Thread Yun Gao
Hi Billy, StreamingFileSink does not expect the Encoder to close the stream passed into the encode method. However, ObjectMapper closes it at the end of its write method. Thus I think you could disable the close action for the ObjectMapper, or change the encode implementation to objectMapper
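To make the suggestion concrete, a sketch of an Encoder along those lines: the element is serialized to a byte array first, so the ObjectMapper never wraps, and therefore never closes, the stream owned by the sink. The class name and generic type are illustrative.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.api.common.serialization.Encoder;

import java.io.IOException;
import java.io.OutputStream;

// Illustrative Encoder: write the serialized bytes directly instead of handing
// the sink's stream to the ObjectMapper, so the stream is never closed here.
public class JsonLineEncoder<T> implements Encoder<T> {

    private transient ObjectMapper mapper;

    @Override
    public void encode(T element, OutputStream stream) throws IOException {
        if (mapper == null) {
            mapper = new ObjectMapper();
        }
        stream.write(mapper.writeValueAsBytes(element));
        stream.write('\n');
    }
}
```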

Re: [DISCUSS] FLIP-147: Support Checkpoints After Tasks Finished

2020-12-24 Thread Yun Gao
Hi all, I tested the previous PoC against the current tests and found some new issues that might cause divergence; apologies, there may also be some reversals of earlier conclusions: 1. Which operators should wait for one more checkpoint before closing? One motiv