I want to know the behavior of Flink streaming systems during daylight
saving changes, since streaming jobs may run across multiple timezones.
Is there any built-in support for this? Can anybody answer?
Thanks in advance,
--Swapnil
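For context on the question above: Flink's tumbling and sliding windows are aligned to the epoch, not to local calendar time, so a DST shift does not change which window a timestamp falls into. Below is a minimal sketch of that alignment arithmetic (it mirrors the formula in Flink's `TimeWindow.getWindowStartWithOffset` in recent versions; treating it as representative here is my assumption):

```java
public class WindowStartDemo {
    // Epoch-aligned tumbling-window start, mirroring Flink's
    // TimeWindow.getWindowStartWithOffset(timestamp, offset, windowSize).
    // Local time and DST rules never enter the computation.
    static long windowStart(long timestamp, long offset, long windowSize) {
        return timestamp - (timestamp - offset + windowSize) % windowSize;
    }

    public static void main(String[] args) {
        long fiveMin = 5 * 60 * 1000L;
        long t1 = 1_478_415_600_000L;   // some epoch millis
        long t2 = t1 + 3_600_000L;      // one wall-clock hour later (e.g. across a DST jump)
        // Both map to epoch-aligned 5-minute windows; no window is skipped
        // or doubled because of a local-time change.
        System.out.println(windowStart(t1, 0, fiveMin));
        System.out.println(windowStart(t2, 0, fiveMin));
    }
}
```

For calendar-sensitive windows (e.g. one window per local day), the usual approach is to pass a window offset matching the zone, or convert timestamps yourself.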
> [...] window so you should
> be able to do the setup/teardown in there. open() and close() are called at
> the start and end of processing, respectively.
>
> Cheers,
> Aljoscha
>
> On Wed, 14 Sep 2016 at 09:04 Swapnil Chougule
> wrote:
>
>> Hi Team,
>> [...]
> "[...]rraySerializer" which is doing exactly the same as the "RawSchema"
>
>
> On Mon, Sep 26, 2016 at 11:27 AM, Swapnil Chougule <
> the.swapni...@gmail.com> wrote:
>
>> Hi Robert, may I know your inputs on the same?
>>
>> Thanks,
>> Swapnil
>
Hi Robert, may I know your inputs on the same?
Thanks,
Swapnil
On Thu, Sep 22, 2016 at 7:12 PM, Stephan Ewen wrote:
> /cc Robert, he is looking into extending the Kafka Connectors to support
> more of Kafka's direct utilities
>
> On Thu, Sep 22, 2016 at 3:17 PM, Swapnil Chougule
Can I get any update, please?
Regards,
Swapnil
I am using a rich window function in my streaming project. I want the "close"
method to get triggered after each window interval.
In my case, open() gets executed only once in the lifetime, and the close()
method doesn't get executed.
Can anybody help to sort out the same? I want a teardown method after each
window interval.
It would be good to have RawSchema as one of the deserialization schemas in
the streaming framework (like SimpleStringSchema).
Many use cases need data in byte-array format after reading from a source
like Kafka.
Any inputs on the same?
On Mon, Sep 12, 2016 at 11:42 AM, Swapnil Chougule
wrote:
> [...]
Thanks, Marton!
On Thu, Sep 22, 2016 at 4:36 PM, Márton Balassi
wrote:
> Done. Go ahead, Swapnil.
>
> Best,
> Marton
>
> On Thu, Sep 22, 2016 at 1:03 PM, Swapnil Chougule wrote:
>
>> Hi Fabian/ Chesnay
>> Can anybody give me permission to assign the JIRA issue (created for the same)?
Hi Fabian/ Chesnay
Can anybody give me permission to assign the JIRA issue (created for the same)?
Thanks,
Swapnil
On Tue, Sep 20, 2016 at 6:18 PM, Swapnil Chougule
wrote:
> Thanks Chesnay & Fabian for the update.
> I will create JIRA issue & open a pull request to fix it.
>
> Thanks,
>
> [...] a pull request to
> fix it?
>
> Thanks, Fabian
>
> 2016-09-20 11:22 GMT+02:00 Chesnay Schepler:
>
>> I would agree that the condition should be changed.
>>
>>
>> On 20.09.2016 10:52, Swapnil Chougule wrote:
>>
>>> I checked the following code in Flink JDBCOutputFormat [...]
I checked the following code in Flink's JDBCOutputFormat while using it in my
project work. I found the following snippet:

@Override
public void writeRecord(Row row) throws IOException {
    if (typesArray != null && typesArray.length > 0
            && typesArray.length == row.productArity()) {
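A minimal illustration of the change being discussed: fail fast when the configured types don't match the row arity, instead of silently taking another branch. This is a hypothetical sketch (the helper name checkArity is mine, not Flink's actual fix):

```java
public class WriteRecordCheckDemo {
    // Hypothetical stand-in for the guard in JDBCOutputFormat.writeRecord():
    // rather than silently proceeding when the configured SQL types don't
    // match the row arity, reject the record with a clear message.
    static void checkArity(int[] typesArray, int rowArity) {
        if (typesArray == null || typesArray.length == 0) {
            return; // no types configured: caller falls back to untyped setObject()
        }
        if (typesArray.length != rowArity) {
            throw new IllegalArgumentException(
                "Configured " + typesArray.length + " column types but the row has "
                + rowArity + " fields");
        }
    }

    public static void main(String[] args) {
        checkArity(new int[] {1, 2, 3}, 3);       // matching arity: accepted
        try {
            checkArity(new int[] {1, 2, 3}, 2);   // mismatch: rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```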
Hi Team,
I am using the tumbling window functionality with a window size of 5 minutes.
I want to perform setup & teardown for each window. I tried using
RichWindowFunction but it didn't work for me.
Can anybody tell me how I can do it?
Attaching the code snippet of what I tried:
impressions.map(
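For what it's worth: open() and close() on a RichWindowFunction are per-task lifecycle hooks, not per-window ones. The per-window place for setup/teardown is the window function's apply()/process() invocation itself, which is called once per window firing. A plain-Java stand-in sketch (Flink types deliberately mocked away; all names here are mine):

```java
import java.util.Arrays;
import java.util.List;

public class PerWindowLifecycleDemo {
    // Stand-in for one window-function invocation: the framework calls this
    // once per window firing, so setup/teardown placed here runs per window.
    static int applyWindow(List<Integer> windowElements) {
        // per-window "setup" would go here (e.g. open a short-lived resource)
        int sum = 0;
        for (int e : windowElements) {
            sum += e;
        }
        // per-window "teardown" would go here (e.g. flush/close the resource)
        return sum;
    }

    public static void main(String[] args) {
        // simulate one 5-minute window's worth of elements
        System.out.println(applyWindow(Arrays.asList(1, 2, 3)));
    }
}
```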
Thanks Chesnay for the update.
On Tue, Sep 13, 2016 at 12:13 AM, Chesnay Schepler
wrote:
> Hello,
>
> the JDBC Sink completely ignores the taskNumber and parallelism.
>
> Regards,
> Chesnay
>
>
> On 12.09.2016 08:41, Swapnil Chougule wrote:
>
> Hi Team,
>
> [...]
Hi Team,
I want to know how taskNumber & numTasks help in opening a DB connection in
Flink's JDBCOutputFormat open(). I checked the docs, which say:
taskNumber - The number of the parallel instance. numTasks - The number of
parallel tasks. But I couldn't get a clear idea of the difference between a
parallel instance and parallel tasks.
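As a plain illustration of the two values (assuming taskNumber is the zero-based index of this parallel instance and numTasks is the total parallelism): each instance can use them to claim its own share of the work or to derive a per-instance resource name. The helper below is hypothetical, not Flink API:

```java
public class TaskIndexDemo {
    // With parallelism numTasks, instance taskNumber (0-based) can e.g.
    // claim every numTasks-th element, or name its own connection/file.
    static boolean isMine(int elementId, int taskNumber, int numTasks) {
        return elementId % numTasks == taskNumber;
    }

    public static void main(String[] args) {
        int numTasks = 3; // three parallel instances
        for (int task = 0; task < numTasks; task++) {
            StringBuilder mine = new StringBuilder("task " + task + " ->");
            for (int id = 0; id < 9; id++) {
                if (isMine(id, task, numTasks)) {
                    mine.append(' ').append(id);
                }
            }
            System.out.println(mine); // disjoint shares covering all ids
        }
    }
}
```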
> [...] Byte array serialization poses no problem to the Flink
> serialization.
>
> On Mon, Sep 5, 2016 at 3:50 PM, Swapnil Chougule
> wrote:
> > I am using Kafka consumer in flink 1.1.1 with Kafka 0.8.2. I want to read
> > byte array as it is in datastream. I tried to use RawSchema as
>
> As a workaround, you can rely on the RichFunction's open()
> method to load such data directly from a distributed file system.
>
> Regards,
> Robert
>
> On Fri, Sep 9, 2016 at 8:13 AM, Swapnil Chougule
> wrote:
>
>> Hi Team,
>>
>> Is there support for Distributed Cache [...]
Hi Team,
Is there support for Distributed Cache in StreamExecutionEnvironment? I
didn't find any such thing in StreamExecutionEnvironment. I am using Flink
1.1.1.
I found the distributed cache for ExecutionEnvironment but not for
StreamExecutionEnvironment.
If yes, can anybody tell me how to use the same?
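Following the workaround quoted above (loading side data in the rich function's open() rather than via a distributed cache), here is a plain-Java sketch of that load step. The loadLookup helper and file layout are my assumptions; in a real job the path would point at a file system reachable from every task, e.g. HDFS:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class OpenTimeLoadDemo {
    // What a RichFunction.open() might do: read a small lookup file once,
    // before any element is processed, and keep it in memory for the task.
    static Set<String> loadLookup(Path file) throws IOException {
        return new HashSet<>(Files.readAllLines(file));
    }

    public static void main(String[] args) throws IOException {
        // simulate the shared file with a local temp file
        Path tmp = Files.createTempFile("lookup", ".txt");
        Files.write(tmp, Arrays.asList("alice", "bob"));
        Set<String> lookup = loadLookup(tmp);
        System.out.println(lookup.contains("alice"));
        Files.delete(tmp);
    }
}
```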
I am using the Kafka consumer in Flink 1.1.1 with Kafka 0.8.2. I want to read
the byte array as-is in a DataStream. I tried to use RawSchema as the
deserialization schema but couldn't find it in 1.1.1.
I want to know whether I have to write my own custom implementation for the
same. Can somebody help me to sort this out?
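Since RawSchema isn't available in 1.1.1, a passthrough schema is small enough to write by hand. Below is a plain-Java sketch using a simplified stand-in for Flink's DeserializationSchema interface (the real interface also has isEndOfStream() and getProducedType(), omitted here; the stand-in name is mine):

```java
import java.util.Arrays;

public class RawSchemaDemo {
    // Simplified stand-in for Flink's DeserializationSchema<T>.
    interface SimpleDeserializationSchema<T> {
        T deserialize(byte[] message);
    }

    // Passthrough: hand the Kafka message bytes on unchanged.
    static final SimpleDeserializationSchema<byte[]> RAW = message -> message;

    public static void main(String[] args) {
        byte[] payload = {1, 2, 3};
        // the "deserialized" value is the original byte array, untouched
        System.out.println(Arrays.equals(RAW.deserialize(payload), payload));
    }
}
```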