Support for daylight saving timezone changes in Flink

2017-02-13 Thread Swapnil Chougule
I want to know the behavior of Flink streaming systems during daylight saving changes, since streaming systems may run in multiple timezones. Is there any built-in support, or is anything extra needed? Can anybody answer? Thanks in advance --Swapnil
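
A minimal sketch, not from this thread, assuming Flink 1.2+ where the tumbling window assigners accept a static offset: windows are aligned to epoch milliseconds, so a fixed offset pins daily windows to a zone's standard time but does not follow daylight-saving transitions by itself. The class and stream names below are made up.

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class OffsetWindowSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Tuple2<String, Long>> events =
                env.fromElements(Tuple2.of("key", 1L), Tuple2.of("key", 2L));

        events
            .keyBy(0)
            // Daily windows with a -8 hour offset start at 16:00 UTC, i.e. at local
            // midnight in a UTC+8 zone. The offset is static, so a zone that observes
            // DST drifts by one hour until the offset is adjusted by the application.
            .window(TumblingProcessingTimeWindows.of(Time.days(1), Time.hours(-8)))
            .sum(1)
            .print();

        env.execute("offset-window-sketch");
    }
}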

Re: Tumbling window rich functionality

2016-10-12 Thread Swapnil Chougule
… for each window, so you should be able to do the setup/teardown in there. open() and close() are called at the start of processing and at the end of processing, respectively. Cheers, Aljoscha
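
A minimal sketch of that lifecycle, with made-up class name and type parameters (keyed by a tuple field, Long input and output): open() and close() belong to the parallel operator instance, so any per-window setup/teardown has to live inside apply().

import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.windowing.RichWindowFunction;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

// Hypothetical RichWindowFunction illustrating where setup/teardown can go.
public class PerWindowSetupFunction
        extends RichWindowFunction<Long, Long, Tuple, TimeWindow> {

    @Override
    public void open(Configuration parameters) {
        // Called once per parallel operator instance, before any window fires:
        // open long-lived resources (connections, caches) here.
    }

    @Override
    public void apply(Tuple key, TimeWindow window, Iterable<Long> values, Collector<Long> out) {
        // Called once per window firing: do per-window "setup" at the top ...
        long sum = 0;
        for (Long v : values) {
            sum += v;
        }
        out.collect(sum);
        // ... and per-window "teardown" at the bottom of this method.
    }

    @Override
    public void close() {
        // Called once when the operator shuts down, not after every window.
    }
}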

Re: RawSchema as deserialization schema

2016-09-26 Thread Swapnil Chougule
rraySerializer" which is doing exactly the same as the "RawSchema" > > > On Mon, Sep 26, 2016 at 11:27 AM, Swapnil Chougule < > the.swapni...@gmail.com> wrote: > >> Hi Robert, May I know your inputs on same ? >> >> Thanks, >> Swapnil >

Re: RawSchema as deserialization schema

2016-09-26 Thread Swapnil Chougule
Hi Robert, may I know your inputs on the same? Thanks, Swapnil

On Thu, Sep 22, 2016 at 7:12 PM, Stephan Ewen wrote:
> /cc Robert, he is looking into extending the Kafka connectors to support more of Kafka's direct utilities

Re: Rich Window Function - When does close(tear down) method executes ?

2016-09-22 Thread Swapnil Chougule
Can I get any update, please? Regards, Swapnil

Rich Window Function - When does close(tear down) method executes ?

2016-09-22 Thread Swapnil Chougule
I am using a rich window function in my streaming project. I want the close() method to get triggered after each window interval. In my case, open() gets executed only once in the operator's lifetime and close() doesn't seem to get executed at all. Can anybody help me sort this out? I want a teardown method to run after each window interval. …

Re: RawSchema as deserialization schema

2016-09-22 Thread Swapnil Chougule
It would be good to have a RawSchema as one of the deserialization schemas in the streaming framework (like SimpleStringSchema). Many use cases need data in byte-array format after reading from a source like Kafka. Any inputs on the same?
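
A minimal sketch of what such a schema could look like against Flink 1.1's DeserializationSchema interface; the class name RawByteSchema is made up, not an existing Flink class.

import org.apache.flink.api.common.typeinfo.PrimitiveArrayTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.streaming.util.serialization.DeserializationSchema;

// Hypothetical "RawSchema": hands the Kafka record bytes through unchanged.
public class RawByteSchema implements DeserializationSchema<byte[]> {

    @Override
    public byte[] deserialize(byte[] message) {
        return message;
    }

    @Override
    public boolean isEndOfStream(byte[] nextElement) {
        // Never mark the stream as finished based on record content.
        return false;
    }

    @Override
    public TypeInformation<byte[]> getProducedType() {
        return PrimitiveArrayTypeInfo.BYTE_PRIMITIVE_ARRAY_TYPE_INFO;
    }
}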

Re: Flink JDBCOutputFormat logs wrong WARN message

2016-09-22 Thread Swapnil Chougule
Thanks, Marton!

On Thu, Sep 22, 2016 at 4:36 PM, Márton Balassi wrote:
> Done. Go ahead, Swapnil. Best, Marton

Re: Flink JDBCOutputFormat logs wrong WARN message

2016-09-22 Thread Swapnil Chougule
Hi Fabian / Chesnay, can anybody give me permission to assign the JIRA (created for the same) to myself? Thanks, Swapnil

On Tue, Sep 20, 2016 at 6:18 PM, Swapnil Chougule wrote:
> Thanks Chesnay & Fabian for the update. I will create a JIRA issue & open a pull request to fix it. Thanks,

Re: Flink JDBCOutputFormat logs wrong WARN message

2016-09-20 Thread Swapnil Chougule
…request to fix it? Thanks, Fabian

2016-09-20 11:22 GMT+02:00 Chesnay Schepler:
> I would agree that the condition should be changed.

Flink JDBCOutputFormat logs wrong WARN message

2016-09-20 Thread Swapnil Chougule
I checked the following code in Flink's JDBCOutputFormat while using it in my project work, and found this snippet:

@Override
public void writeRecord(Row row) throws IOException {
    if (typesArray != null && typesArray.length > 0 && typesArray.length == row.productArity()) {
    …
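
A sketch of the fix being discussed, assuming the intent is to warn only when the configured SQL types do not match the arity of the Row (the snippet above warns in the matching case); the log message below is paraphrased, not the exact string from JDBCOutputFormat.

// Corrected guard: warn when the SQL types array does NOT match the Row arity.
@Override
public void writeRecord(Row row) throws IOException {
    if (typesArray != null && typesArray.length > 0 && typesArray.length != row.productArity()) {
        LOG.warn("Column SQL types array doesn't match the arity of the passed Row -- check the passed array.");
    }
    // ... set statement parameters and add the row to the batch as before ...
}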

Tumbling window rich functionality

2016-09-14 Thread Swapnil Chougule
Hi Team, I am using the tumbling window functionality with a window size of 5 minutes. I want to perform setup & teardown functionality for each window. I tried using RichWindowFunction but it didn't work for me. Can anybody tell me how I can do it? Attaching the code snippet of what I tried: impressions.map(…

Re: Flink JDBC JDBCOutputFormat Open

2016-09-13 Thread Swapnil Chougule
Thanks, Chesnay, for the update.

On Tue, Sep 13, 2016 at 12:13 AM, Chesnay Schepler wrote:
> Hello, the JDBC sink completely ignores the taskNumber and parallelism. Regards, Chesnay

Flink JDBC JDBCOutputFormat Open

2016-09-11 Thread Swapnil Chougule
Hi Team, I want to know how taskNumber & numTasks help in opening the DB connection in Flink JDBC's JDBCOutputFormat open(). I checked the docs, which say: taskNumber - the number of the parallel instance; numTasks - the number of parallel tasks. But I couldn't get a clear idea of the difference between a parallel instance and par…
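
A hypothetical OutputFormat that only illustrates the two parameters: taskNumber identifies the parallel instance (subtask) that open() runs in, and numTasks is the total number of parallel instances. (As noted in the reply above, the actual JDBCOutputFormat ignores both.)

import java.io.IOException;
import org.apache.flink.api.common.io.OutputFormat;
import org.apache.flink.configuration.Configuration;

// Made-up OutputFormat that only logs which subtask it is running in.
public class SubtaskAwareOutputFormat implements OutputFormat<String> {

    @Override
    public void configure(Configuration parameters) {
    }

    @Override
    public void open(int taskNumber, int numTasks) throws IOException {
        // taskNumber: index of this parallel instance; numTasks: total parallelism.
        // A format could use these to, e.g., write per-subtask files or shard connections.
        System.out.println("Opening subtask " + (taskNumber + 1) + " of " + numTasks);
    }

    @Override
    public void writeRecord(String record) throws IOException {
        // No-op in this sketch.
    }

    @Override
    public void close() throws IOException {
    }
}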

Re: RawSchema as deserialization schema

2016-09-11 Thread Swapnil Chougule
Byte array serialization poses no problem to the Flink serialization.

Re: Distributed Cache support for StreamExecutionEnvironment

2016-09-11 Thread Swapnil Chougule
…as a workaround, you can rely on the RichFunction's open() method to load such data directly from a distributed file system. Regards, Robert
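
A minimal sketch of that workaround, assuming the lookup data fits in memory; the HDFS path and class name are made up.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.HashSet;
import java.util.Set;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.core.fs.FSDataInputStream;
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.core.fs.Path;

// Hypothetical enrichment function: loads a lookup file from HDFS once per
// parallel instance in open(), then uses it for every record in map().
public class DictionaryMapFunction extends RichMapFunction<String, String> {

    private transient Set<String> dictionary;

    @Override
    public void open(Configuration parameters) throws Exception {
        dictionary = new HashSet<>();
        Path path = new Path("hdfs:///data/lookup.txt");   // made-up location
        FileSystem fs = path.getFileSystem();
        try (FSDataInputStream in = fs.open(path);
             BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                dictionary.add(line.trim());
            }
        }
    }

    @Override
    public String map(String value) {
        return dictionary.contains(value) ? value + ",known" : value + ",unknown";
    }
}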

Distributed Cache support for StreamExecutionEnvironment

2016-09-08 Thread Swapnil Chougule
Hi Team, is there support for a distributed cache in StreamExecutionEnvironment? I didn't find any such thing in StreamExecutionEnvironment. I am using Flink 1.1.1. I found the distributed cache for ExecutionEnvironment but not for StreamExecutionEnvironment. If yes, can anybody tell me how to use the same…

RawSchema as deserialization schema

2016-09-05 Thread Swapnil Chougule
I am using the Kafka consumer in Flink 1.1.1 with Kafka 0.8.2. I want to read byte arrays as-is in the datastream. I tried to use RawSchema as the deserialization schema but couldn't find it in 1.1.1. I want to know whether I have to write my own custom implementation for the same. Can somebody help me sort out s…
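
A minimal usage sketch, assuming a byte[] schema like the hypothetical RawByteSchema shown earlier in this digest and the Kafka 0.8 connector; the topic name and connection properties are placeholders.

import java.util.Properties;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;

public class RawKafkaReadSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");   // placeholder
        props.setProperty("zookeeper.connect", "localhost:2181");   // needed by the 0.8 consumer
        props.setProperty("group.id", "raw-bytes-demo");            // placeholder

        // The consumer emits each Kafka record's value as an untouched byte[].
        DataStream<byte[]> raw = env.addSource(
                new FlinkKafkaConsumer08<>("my-topic", new RawByteSchema(), props));

        raw.map(new MapFunction<byte[], String>() {
            @Override
            public String map(byte[] bytes) {
                return "record of " + bytes.length + " bytes";
            }
        }).print();

        env.execute("raw-kafka-read-sketch");
    }
}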