Re: Flink writeAsCsv

2016-02-04 Thread Fabian Hueske
custom created evictor. In the window and in the evictor you have access to all data and you can create specific files for each window triggered.
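
A minimal sketch of the window-function side of this suggestion (the thread also mentions a custom evictor, which is not shown here): each triggered window writes its contents to its own file. It assumes a Flink 1.x-era DataStream API, a text stream fed from a local socket (for example via nc -lk 9999), and an illustrative output directory /tmp/windows that must already exist; none of these details come from the thread itself.

    import java.io.PrintWriter;

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.functions.windowing.AllWindowFunction;
    import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
    import org.apache.flink.streaming.api.windowing.time.Time;
    import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
    import org.apache.flink.util.Collector;

    public class PerWindowFiles {

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            env.socketTextStream("localhost", 9999)   // demo input, e.g. fed by: nc -lk 9999
               .windowAll(TumblingProcessingTimeWindows.of(Time.seconds(10)))
               .apply(new AllWindowFunction<String, String, TimeWindow>() {
                   @Override
                   public void apply(TimeWindow window, Iterable<String> values,
                                     Collector<String> out) throws Exception {
                       // One file per triggered window, named after the window's start timestamp.
                       try (PrintWriter writer = new PrintWriter(
                                "/tmp/windows/window-" + window.getStart() + ".csv")) {
                           for (String value : values) {
                               writer.println(value);
                               out.collect(value);   // also forward the elements downstream
                           }
                       }
                   }
               })
               .print();

            env.execute("Per-window file output");
        }
    }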

RE: Flink writeAsCsv

2016-02-04 Thread Radu Tudoran
triggered. From: Radu Prodan [mailto:raduprod...@gmail.com] Sent: Thursday, February 04, 2016 11:58 AM To: user@flink.apache.org Subject: Re: Flink writeAsCsv Hi Marton, Thanks to your comment I managed to get it working. At least it outputs the results. However, what I need is to output each

Re: Flink writeAsCsv

2016-02-04 Thread Radu Prodan
Hi Marton, Thanks to your comment I managed to get it working. At least it outputs the results. However, what I need is to output each window result separately. Now, it outputs the results of parallel working windows (I think) and appends the new results to them. For example, if I have parallelism
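
One way to address the appended, per-subtask output described above is to pin the CSV sink to a single parallel instance and overwrite the file on each run. The sketch below assumes a tuple-typed result stream and an illustrative path /tmp/out.csv; the class name and sample data are made up for the example and are not from the thread.

    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.core.fs.FileSystem.WriteMode;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class SingleCsvSink {

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Stand-in for the windowed results discussed in the thread; any tuple-typed stream works.
            DataStream<Tuple2<String, Integer>> results =
                env.fromElements(Tuple2.of("a", 1), Tuple2.of("b", 2), Tuple2.of("c", 3));

            results
                .writeAsCsv("/tmp/out.csv", WriteMode.OVERWRITE) // replace the file instead of appending
                .setParallelism(1);                              // one output file, not one per subtask

            env.execute("Single-file CSV sink");
        }
    }

Note that setParallelism(1) makes the sink a single writer; for one file per window, the window-function approach from the earlier reply is the more direct fit.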

Re: Flink writeAsCsv

2016-02-04 Thread Márton Balassi
Hey Radu, As you are using the streaming API I assume that you call env.execute() in both cases. Is that the case? Do you see any errors appearing? My first guess would be that if your data type is not a tuple type, writeAsCsv does not work by default. Best, Marton
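
Marton's two checks translate into a short example: convert the stream to a tuple type before calling writeAsCsv, and make sure env.execute() is called so the job actually runs. This is a sketch, not code from the thread; the class name, sample data, and the /tmp/words.csv path are illustrative.

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class WriteAsCsvBasics {

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            env.fromElements("flink", "writeAsCsv", "example")
               // writeAsCsv expects a tuple type, so map the raw strings into Tuple2 first.
               .map(new MapFunction<String, Tuple2<String, Integer>>() {
                   @Override
                   public Tuple2<String, Integer> map(String value) {
                       return Tuple2.of(value, value.length());
                   }
               })
               .writeAsCsv("/tmp/words.csv");

            // Nothing runs until execute() is called, which is the first thing Marton asks about.
            env.execute("writeAsCsv example");
        }
    }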