Re: Tuple vs Row

2016-10-07 Thread Timo Walther
Hi Flavio, I have never benchmarked Rows but they definitely add some overhead. Each new Row also creates an array object storing the values, the serializer maintains a bitmask for tagging null values, and the comparator has additional checks for null values. Would be interesting to measure t
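
To make the difference concrete, here is a minimal sketch (not a benchmark) contrasting the two types. The package of Row depends on the Flink version; in recent releases it lives in org.apache.flink.types.

import org.apache.flink.types.Row

object RowVsTuple {
  def main(args: Array[String]): Unit = {
    // A Scala tuple: arity and field types are fixed at compile time,
    // and fields are not expected to be null.
    val t: (String, Int) = ("alice", 42)

    // A Row: values live in an internal Object[] and every field is nullable,
    // which is why the Row serializer carries a null bitmask per record.
    val r = new Row(2)
    r.setField(0, "alice")
    r.setField(1, null) // legal for a Row field, unlike for a Tuple field

    println(t) // (alice,42)
    println(r) // alice,null
  }
}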

Re: Listening to timed-out patterns in Flink CEP

2016-10-07 Thread lgfmt
The following is a better link: http://mail-archives.apache.org/mod_mbox/flink-user/201609.mbox/%3CCAC27z%3DOTtv7USYUm82bE43-DkoGfVC4UAWD6uQwwRgTsE5be8g%40mail.gmail.com%3E - LF

Re: more complex patterns for CEP (was: CEP two transitions to the same state)

2016-10-07 Thread lgfmt
Hi Till, Thanks for the detailed response. I'm looking forward to seeing these features implemented in Flink. Can anyone provide timelines for the three tickets that you mentioned in your response? - LF

Re: Listening to timed-out patterns in Flink CEP

2016-10-07 Thread lgfmt
Won't the upcoming CEP negation (absence of an event) feature solve this issue? See this discussion thread: http://mail-archives.apache.org/mod_mbox/flink-user/201609.mbox/%3CCAC27z%3DOD%2BTq8twBw_1YKni5sWAU3g1S9WDpJw0DUwgiG9YX9Fg%40mail.gmail.com%3E  //  Atul

Re: readCsvFile

2016-10-07 Thread Fabian Hueske
I would check that the field delimiter is correctly set. With the correct delimiter your code would give ((a),1) ((aa),1) because the single field is wrapped in a Tuple1. You have to unwrap it in the map function: .map { t => (t._1, 1) }
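
A minimal sketch of the full pipeline described above, using the Scala DataSet API; the path, delimiter and column index are placeholders that have to match the actual file:

import org.apache.flink.api.scala._

object ReadFirstColumn {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Read only the first column; the result type is Tuple1[String], so every
    // record is wrapped in a Tuple1 ("(a)" rather than "a").
    val firstColumn = env.readCsvFile[Tuple1[String]](
      "/path/to/input.csv",      // hypothetical path
      fieldDelimiter = ",",      // must match the delimiter actually used in the file
      includedFields = Array(0))

    // Unwrap the Tuple1 so the output is (a,1), (aa,1) instead of ((a),1), ((aa),1).
    val counts = firstColumn.map { t => (t._1, 1) }

    counts.print()
  }
}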

Re: jdbc.JDBCInputFormat

2016-10-07 Thread Fabian Hueske
As the exception says, the class org.apache.flink.api.scala.io.jdbc.JDBCInputFormat does not exist. You have to import org.apache.flink.api.java.io.jdbc.JDBCInputFormat instead. There is no Scala implementation of this class, but you can also use Java classes from Scala.
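
A hedged sketch of using the Java JDBCInputFormat from Scala. The driver, URL and query are placeholders, and the builder shown here (producing Row and taking a RowTypeInfo) matches newer Flink releases; older 1.1.x builds returned Java Tuples instead. The key point, importing the class from the java package, is the same either way.

import org.apache.flink.api.common.typeinfo.BasicTypeInfo
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat // note: java package, there is no scala variant
import org.apache.flink.api.java.typeutils.RowTypeInfo
import org.apache.flink.api.scala._
import org.apache.flink.types.Row

object JdbcRead {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Field types of the query result; also used as the implicit
    // TypeInformation for the resulting DataSet[Row].
    implicit val rowTypeInfo: RowTypeInfo = new RowTypeInfo(
      BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO)

    val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
      .setDrivername("org.postgresql.Driver")             // hypothetical driver
      .setDBUrl("jdbc:postgresql://localhost:5432/test")  // hypothetical URL
      .setQuery("SELECT id, name FROM persons")           // hypothetical query
      .setRowTypeInfo(rowTypeInfo)
      .finish()

    val persons: DataSet[Row] = env.createInput(inputFormat)
    persons.print()
  }
}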

jdbc.JDBCInputFormat

2016-10-07 Thread Alberto Ramón
I want to use createInput + buildJDBCInputFormat to access a database from Scala. PB1: import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat Error:(25, 37) object jdbc is not a member of package org.apache.flink.api.java.io import org.apache.flink.api.java.io.jdbc.JDBCInputFormat Then, I can't use

Re: Data Transfer between TM should be encrypted

2016-10-07 Thread vinay patil
Hi Stephan, https://github.com/apache/flink/pull/2518 Is this pull request going to be part of the 1.2 release? Just wanted to get an idea of the timelines so that I can pass it on to the team. Regards, Vinay Patil

Re: readCsvFile

2016-10-07 Thread Alberto Ramón
Hmm, your solution compiles without errors, but includedFields isn't working: [image: inline image 1] The output is incorrect: [image: inline image 2] The correct result must be only the 1st column: (a,1) (aa,1)

Re: Compression for AvroOutputFormat

2016-10-07 Thread Kostas Kloudas
Hi Lars, As far as I know there are no plans to do so in the near future, but every contribution is welcome. Looking forward to your Pull Request. Regards, Kostas

Tuple vs Row

2016-10-07 Thread Flavio Pompermaier
Hi to all, is there any performance degradation using Row instead of Tuple objects in Flink? Best, Flavio

Compression for AvroOutputFormat

2016-10-07 Thread lars.bachmann
Hi, at the moment it is not possible to set a compression for the AvroOutputFormat. There is a post on the mailing list from April this year about the same topic, but it seems that nothing has happened so far. Are there any plans to add this feature? Otherwise I could contribute this code. R

Re: Listening to timed-out patterns in Flink CEP

2016-10-07 Thread Till Rohrmann
Hi David, in case of event time, the timeout will be detected when the first watermark exceeding the timeout value is received. Thus, it depends a little bit on how you generate watermarks (e.g. periodically, or one watermark per event). In case of processing time, the time is only updated whenever a new e
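
As an illustration of the "watermark per event" case, here is a minimal sketch of a punctuated watermark assigner for a hypothetical Event type, using the timestamp/watermark interfaces available in Flink 1.x. Emitting a watermark with every element lets CEP detect a pattern timeout as soon as the next (late enough) event arrives, rather than only at the next periodic watermark interval.

import org.apache.flink.streaming.api.functions.AssignerWithPunctuatedWatermarks
import org.apache.flink.streaming.api.watermark.Watermark

// Hypothetical event type carrying its own event-time timestamp.
case class Event(id: String, timestamp: Long)

class PerEventWatermarks extends AssignerWithPunctuatedWatermarks[Event] {

  // Use the event's own field as the event-time timestamp.
  override def extractTimestamp(element: Event, previousElementTimestamp: Long): Long =
    element.timestamp

  // Emit a watermark for every element.
  override def checkAndGetNextWatermark(lastElement: Event, extractedTimestamp: Long): Watermark =
    new Watermark(extractedTimestamp)
}

// Usage (sketch): stream.assignTimestampsAndWatermarks(new PerEventWatermarks)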

Re: Merging N parallel/partitioned WindowedStreams together, one-to-one, into a global window stream

2016-10-07 Thread Fabian Hueske
If you are using time windows, you can access the TimeWindow parameter of the WindowFunction.apply() method. The TimeWindow contains the start and end timestamps of a window (as Longs), which can act as keys. If you are using count windows, I think you have to use a counter as you described.
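
A minimal sketch of a WindowFunction that emits the window start timestamp alongside its aggregate, assuming hypothetical (String, Int) input records; the start timestamp is identical for all keys in the same time window and can therefore serve as the merge key.

import org.apache.flink.streaming.api.scala.function.WindowFunction
import org.apache.flink.streaming.api.windowing.windows.TimeWindow
import org.apache.flink.util.Collector

class SumPerWindow
    extends WindowFunction[(String, Int), (Long, String, Int), String, TimeWindow] {

  override def apply(
      key: String,
      window: TimeWindow,
      input: Iterable[(String, Int)],
      out: Collector[(Long, String, Int)]): Unit = {
    // window.getStart is the same Long for every parallel instance of this
    // time window, so downstream operators can use it as a join/merge key.
    out.collect((window.getStart, key, input.map(_._2).sum))
  }
}

// Usage (sketch): stream.keyBy(_._1).timeWindow(Time.minutes(1)).apply(new SumPerWindow)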