Timestamp conversion problem in Flink Table/SQL

2018-12-27 Thread jia yichao
Hi community, I have recently encountered a problem with time conversion in Flink Table/SQL. When the processed field contains a timestamp type, the code generated by the Flink Table codegen first converts the timestamp type to a long, and then converts the long back to a timestamp type on output.
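The behaviour described above can be sketched with plain JDK types. This is a hedged illustration of the round trip, not Flink's actual generated code: the `toInternal`/`toExternal` helpers are hypothetical stand-ins assuming the internal long representation holds epoch milliseconds, in which case sub-millisecond precision is lost.

```java
import java.sql.Timestamp;

public class TimestampRoundTrip {
    // Hypothetical stand-in for the codegen's timestamp -> long conversion,
    // assuming the internal long holds milliseconds since the epoch.
    static long toInternal(Timestamp ts) {
        return ts.getTime(); // millisecond precision only
    }

    // Hypothetical stand-in for the long -> timestamp conversion on output.
    static Timestamp toExternal(long millis) {
        return new Timestamp(millis);
    }

    public static void main(String[] args) {
        Timestamp original = Timestamp.valueOf("2018-12-27 10:15:30.123456789");
        Timestamp roundTripped = toExternal(toInternal(original));
        // The nanosecond part below the millisecond is dropped in the round trip.
        System.out.println(original);     // 2018-12-27 10:15:30.123456789
        System.out.println(roundTripped); // 2018-12-27 10:15:30.123
    }
}
```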

Access Flink configuration in user functions

2018-12-27 Thread Paul Lam
Hi to all, I would like to use a custom RocksDBStateBackend which uses the default checkpoint dir in the Flink configuration, but I failed to find a way to access the Flink configuration in user code. So I wonder: is it possible to retrieve Flink configurations (not user-defined global parameters) a
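For context, the default checkpoint directory the question refers to is a cluster-level setting in flink-conf.yaml rather than a user parameter. A minimal fragment, with a placeholder path:

```yaml
# flink-conf.yaml -- cluster-level setting, not a user-defined global parameter.
# The path below is a placeholder example.
state.checkpoints.dir: hdfs:///flink/checkpoints
```

One possible (hedged) approach is loading this file directly via `GlobalConfiguration.loadConfiguration()` on nodes where the Flink conf directory is available, though whether that fits inside user functions depends on the deployment.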

Flink SQL client always cost 2 minutes to submit job to a local cluster

2018-12-27 Thread yinhua.dai
I am using Flink 1.6.1. I tried to use the Flink SQL client with some of my own jars via --jar and --library. It can execute SQL queries, however it always takes around 2 minutes to submit the job to the local cluster; but when I copy my jar to the Flink lib directory and remove the --jar and --library parameters, it can

Re: Kafka consumer, is there a way to filter out messages using key only?

2018-12-27 Thread Hao Sun
Cool, thanks! I used the option value approach, worked well. On Thu, Dec 27, 2018, 03:49 Dominik Wosiński wrote: > Hey, > AFAIK, returning null from deserialize function in FlinkKafkaConsumer will > indeed filter the message out and it won't be further processed. > > Best Regards, > Dom. > > Wed,

Re: Row format or bulk format

2018-12-27 Thread Andrey Zagrebin
Hi Taher, The row format is used to append data to part files record by record. When you implement the Encoder interface, you decide how to serialise a record into bytes and write those bytes into the OutputStream of the current part file. At the moment I do not see any facility classes in the Flink code base

Re: Kafka consumer, is there a way to filter out messages using key only?

2018-12-27 Thread Dominik Wosiński
Hey, AFAIK, returning null from the deserialize function in FlinkKafkaConsumer will indeed filter the message out and it won't be further processed. Best Regards, Dom. Wed, 19 Dec 2018 at 11:06 Dawid Wysakowicz wrote: > Hi, > > I'm afraid that there is no out-of-the box solution for this, but w
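The filtering mechanism described in this reply can be sketched without a Kafka dependency. This is a hedged, self-contained illustration: `deserialize` below is a hypothetical stand-in for a keyed deserialization schema's method (the real Flink interface is not imported here), and the loop mimics the consumer dropping records for which it returns null, so the value bytes of filtered messages are never parsed.

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class KeyFilterSketch {
    // Hypothetical stand-in for a keyed deserialization schema's deserialize:
    // returning null signals the consumer to drop the record entirely.
    static String deserialize(byte[] key, byte[] value) {
        String k = (key == null) ? "" : new String(key, StandardCharsets.UTF_8);
        if (!k.startsWith("wanted")) {
            return null; // filtered out purely on the key; value is never parsed
        }
        return new String(value, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String[][] records = {
            {"wanted-1", "payload-a"},
            {"other-2", "payload-b"},
            {"wanted-3", "payload-c"},
        };
        List<String> emitted = new ArrayList<>();
        for (String[] r : records) {
            String result = deserialize(r[0].getBytes(StandardCharsets.UTF_8),
                                        r[1].getBytes(StandardCharsets.UTF_8));
            if (result != null) { // mimics the consumer skipping null results
                emitted.add(result);
            }
        }
        System.out.println(emitted); // only the "wanted" payloads survive
    }
}
```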

Row format or bulk format

2018-12-27 Thread Taher Koitawala
Hi All, I am currently working on Flink 1.7 with StreamingFileSink and need to write AVRO data on the S3 filesystem; the plan is to move from the old BucketingSink to the new sink, which is much more compatible. The application is also using checkpointing right now. Can someone pleas