Re: Collector.collect

2017-05-02 Thread Chesnay Schepler
: Monday, May 01, 2017 12:56 PM To: Newport, Billy [Tech]; 'user@flink.apache.org' Subject: Re: Collector.collect Oh you have multiple different output formats, missed that. For the Batch API you are, I believe, correct: using a custom output-format is the best solution. In the Strea

RE: Collector.collect

2017-05-02 Thread Newport, Billy
, May 01, 2017 12:56 PM To: Newport, Billy [Tech]; 'user@flink.apache.org' Subject: Re: Collector.collect Oh you have multiple different output formats, missed that. For the Batch API you are, I believe, correct: using a custom output-format is the best solution. In the Streaming API the

Re: Collector.collect

2017-05-01 Thread Chesnay Schepler
17 10:41 AM To: user@flink.apache.org Subject: Re: Collector.collect Hello, @Billy, what prevented you from duplicating/splitting the record, based on the bitmask, in a map function before the sink? This shouldn't incur any serialization overhead if the sink is chained to the map. T

RE: Collector.collect

2017-05-01 Thread Newport, Billy
approaches? From: Chesnay Schepler [mailto:ches...@apache.org] Sent: Monday, May 01, 2017 10:41 AM To: user@flink.apache.org Subject: Re: Collector.collect Hello, @Billy, what prevented you from duplicating/splitting the record, based on the bitmask, in a map function before the sink? This shouldn't

Re: Collector.collect

2017-05-01 Thread Chesnay Schepler
ch faster. We’re finding a need to do this kind of optimization pretty frequently with Flink. From: Gaurav Khandelwal [mailto:gaurav671...@gmail.com] Sent: Saturday, April 29, 2017 4:32 AM To: user@flink.apache.org Subject: Collector.collect Hello I am working on RichProcessFunction and

RE: Collector.collect

2017-05-01 Thread Newport, Billy
Subject: Collector.collect Hello I am working on RichProcessFunction and I want to emit multiple records at a time. To achieve this, I am currently doing: while(condition) { Collector.collect(new Tuple<>...); } I was wondering: is this the correct way, or is there another alternative?

Collector.collect

2017-04-29 Thread Gaurav Khandelwal
Hello I am working on RichProcessFunction and I want to emit multiple records at a time. To achieve this, I am currently doing: while(condition) { Collector.collect(new Tuple<>...); } I was wondering: is this the correct way, or is there another alternative?
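Calling `collect()` once per output record inside a loop is indeed the standard way to emit multiple records from a single input element in Flink. A minimal sketch of that fan-out pattern, using a simplified stand-in for Flink's `org.apache.flink.util.Collector` so it runs without Flink on the classpath (the class and method bodies here are illustrative, not Flink's actual API):

```java
import java.util.ArrayList;
import java.util.List;

public class MultiEmitSketch {

    // Simplified stand-in for org.apache.flink.util.Collector (illustrative only).
    interface Collector<T> {
        void collect(T record);
    }

    // Hypothetical processElement body: one input element fans out to many
    // output records by calling collect() once per record.
    static void processElement(String csvLine, Collector<String> out) {
        for (String field : csvLine.split(",")) {
            out.collect(field.trim()); // each call forwards exactly one record downstream
        }
    }

    public static void main(String[] args) {
        List<String> emitted = new ArrayList<>();
        processElement("a, b, c", emitted::add);
        System.out.println(emitted); // prints [a, b, c]
    }
}
```

The `Collector` is an instance passed into the function (as `out` above), so collect is invoked on that instance rather than on the class.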

Re: Exceptions from collector.collect after cancelling job

2016-09-30 Thread Shannon Carey
My flat map function is catching & logging the exception. The try block happens to encompass the call to Collector#collect(). I will move the call to collect outside of the try. That should silence the log message. On 9/30/16, 3:51 AM, "Ufuk Celebi" wrote: >On Thu, Sep 29, 2016 at 9:29 PM,

Re: Exceptions from collector.collect after cancelling job

2016-09-30 Thread Ufuk Celebi
On Thu, Sep 29, 2016 at 9:29 PM, Shannon Carey wrote: > It looks like Flink is disabling the objects that the FlatMap collector > relies on before disabling the operator itself. Is that expected/normal? Is > there anything I should change in my FlatMap function or job code to account > for it? He
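The fix Shannon describes in the reply above — moving the `collect()` call out of the try block — keeps the user function from catching and logging the runtime's own exceptions during cancellation. A minimal sketch of that pattern (again with a simplified stand-in for Flink's `Collector`; the names are illustrative, not actual Flink API):

```java
public class CollectOutsideTrySketch {

    // Simplified stand-in for org.apache.flink.util.Collector (illustrative only).
    interface Collector<T> {
        void collect(T record);
    }

    // Hypothetical flatMap body: only the user logic that can legitimately
    // fail sits inside the try. collect() is called afterwards, so a
    // RuntimeException thrown by the downstream chain during job
    // cancellation propagates instead of being caught and logged here.
    static void flatMap(String value, Collector<Integer> out) {
        int parsed;
        try {
            parsed = Integer.parseInt(value.trim());
        } catch (NumberFormatException e) {
            return; // drop malformed input; do NOT wrap collect() in this try
        }
        out.collect(parsed);
    }

    public static void main(String[] args) {
        java.util.List<Integer> emitted = new java.util.ArrayList<>();
        flatMap(" 42 ", emitted::add);
        flatMap("oops", emitted::add); // malformed, silently dropped
        System.out.println(emitted);   // prints [42]
    }
}
```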

Exceptions from collector.collect after cancelling job

2016-09-29 Thread Shannon Carey
When I cancel a job, I get many exceptions that look like this: java.lang.RuntimeException: Could not forward element to next operator at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:376) at org.apache.flink.streaming.runtime.tasks.Opera