Re: Record has Long.MIN_VALUE timestamp (= no timestamp marker). Is the time characteristic set to 'ProcessingTime', or did you forget to call 'DataStream.assignTimestampsAndWatermarks(...)'?

2016-07-10 Thread David Olsen
10:32 AM, Kostas Kloudas wrote: Can it be that when you define the ‘right’ stream, you do not specify a timestamp extractor? This is done the same way you do it for the ‘left’ stream. Kostas …
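
In code, Kostas' point is that both join inputs need their own assigner. A minimal Java fragment, assuming a hypothetical MyEvent type with a getTimestampMillis() accessor and roughly the Flink 1.x DataStream API:

    // The 'right' stream needs a timestamp extractor just like the 'left' one; otherwise its
    // records enter event-time operators carrying the Long.MIN_VALUE marker from the subject line.
    DataStream<MyEvent> rightWithTimestamps = rightStream.assignTimestampsAndWatermarks(
            new AscendingTimestampExtractor<MyEvent>() {
                @Override
                public long extractAscendingTimestamp(MyEvent event) {
                    return event.getTimestampMillis();   // hypothetical accessor
                }
            });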

Re: Record has Long.MIN_VALUE timestamp (= no timestamp marker). Is the time characteristic set to 'ProcessingTime', or did you forget to call 'DataStream.assignTimestampsAndWatermarks(...)'?

2016-07-07 Thread David Olsen
… to eventTime and leave the rest of your code as is. Let me know if this answered your question. Cheers, Kostas. On Jul 6, 2016, at 3:43 PM, David Olsen wrote: I have two streams. One will produce a single record, and the other …
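
Kostas' suggestion ("set … to eventTime and leave the rest of your code as is") as a short Java fragment:

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // The default characteristic is ProcessingTime; without this switch, the timestamps
    // assigned via assignTimestampsAndWatermarks(...) are ignored by event-time operators.
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);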

Record has Long.MIN_VALUE timestamp (= no timestamp marker). Is the time characteristic set to 'ProcessingTime', or did you forget to call 'DataStream.assignTimestampsAndWatermarks(...)'?

2016-07-06 Thread David Olsen
I have two streams. One will produce a single record, and the other has a list of records, and I want to do a left join. So, for example: Stream A: record1, record2, ... Stream B: single-record. After the join: (record1, single-record), (record2, single-record), ... However, with the following streaming job …
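
For orientation, a hedged Java fragment of the kind of job described here, assuming streamA and streamB are DataStream<Tuple2<String, String>> values keyed on f0 with event-time timestamps already assigned (see the replies above). Note that Flink's windowed DataStream join is an inner join per window; true left-join semantics (emitting record1 even when no single-record arrives) would need extra handling, e.g. a CoFlatMapFunction.

    DataStream<String> joined = streamA
            .join(streamB)
            .where(new KeySelector<Tuple2<String, String>, String>() {
                @Override
                public String getKey(Tuple2<String, String> a) { return a.f0; }
            })
            .equalTo(new KeySelector<Tuple2<String, String>, String>() {
                @Override
                public String getKey(Tuple2<String, String> b) { return b.f0; }
            })
            .window(TumblingEventTimeWindows.of(Time.seconds(10)))   // window size is arbitrary here
            .apply(new JoinFunction<Tuple2<String, String>, Tuple2<String, String>, String>() {
                @Override
                public String join(Tuple2<String, String> a, Tuple2<String, String> b) {
                    return a.f1 + ", " + b.f1;   // e.g. "record1, single-record"
                }
            });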

Re: java.io.IOException: Couldn't access resultSet

2016-06-06 Thread David Olsen
… the batch API in the streaming API. While that should actually work, it is a somewhat new and less tested function. Let's double check that the call to open() is properly forwarded. On Sun, Jun 5, 2016 at 12:47 PM, David Olsen …
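
For context, the "batch API in the streaming API" pattern being discussed, as a hedged fragment: a batch JDBCInputFormat (built with its builder as in the original message further down) handed to StreamExecutionEnvironment.createInput, with open() left entirely to the runtime.

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // jdbcInput is a JDBCInputFormat producing Tuple2<String, Integer>; the runtime, not user
    // code, calls configure()/open()/close() on it when the source task runs.
    DataStream<Tuple2<String, Integer>> rows = env.createInput(
            jdbcInput,
            new TupleTypeInfo<Tuple2<String, Integer>>(
                    BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.INT_TYPE_INFO));

    rows.print();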

Re: java.io.IOException: Couldn't access resultSet

2016-06-05 Thread David Olsen
… before StreamExecutionEnvironment creates the jdbc input? Thanks. On 5 June 2016 at 18:26, David Olsen wrote: I removed the open method when constructing the jdbc input format, but I still obtain the "couldn't access resultSet" error. Caused by: java.io.IOException: Cou…

Re: java.io.IOException: Couldn't access resultSet

2016-06-05 Thread David Olsen
… NullPointerException at org.apache.flink.api.java.io.jdbc.JDBCInputFormat.nextRecord(JDBCInputFormat.java:164) ... 7 more. Anything else I should check? Thanks. On 5 June 2016 at 17:26, Chesnay Schepler wrote: you are not supposed to call open yourselves. On 05.06.2016 11:05, David Olsen wrote: …
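
In code terms, a hedged sketch of the change Chesnay suggests (driver, url, and query are placeholder variables): build the format, but leave the open() call to the runtime.

    JDBCInputFormat format = JDBCInputFormat.buildJDBCInputFormat()
            .setDrivername(driver)   // driver/url/query are placeholders
            .setDBUrl(url)
            .setQuery(query)
            .finish();

    // Do NOT call format.open(...) here. The runtime calls open() with a real input split on
    // each source task, then nextRecord() until reachedEnd(), then close(); opening the format
    // by hand can leave it in a state the runtime does not expect.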

java.io.IOException: Couldn't access resultSet

2016-06-05 Thread David Olsen
Following the sample on the Flink website[1] to test jdbc, I encountered the error "Couldn't access resultSet". It looks like nextRecord is called before the open() function. However, I've called open() when constructing the jdbc input format. Are there any functions I should call before job submission? def jdbc()= …
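
For reference, the shape of the JDBC sample in the Flink batch documentation of that era, as a hedged Java sketch with placeholder driver/URL/query. The format is only built here and never opened by hand; later versions of the connector switched to Row results via setRowTypeInfo instead of Tuple types.

    import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.io.jdbc.JDBCInputFormat;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.api.java.typeutils.TupleTypeInfo;

    ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

    // Only build the format here; the runtime calls open()/close() around nextRecord().
    DataSet<Tuple2<String, Integer>> dbData = env.createInput(
            JDBCInputFormat.buildJDBCInputFormat()
                    .setDrivername("org.apache.derby.jdbc.EmbeddedDriver")   // placeholder settings
                    .setDBUrl("jdbc:derby:memory:persons")
                    .setQuery("select name, age from persons")
                    .finish(),
            new TupleTypeInfo<Tuple2<String, Integer>>(
                    BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.INT_TYPE_INFO));

    dbData.print();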

Re: Parallel read text

2016-05-28 Thread David Olsen
…/main/java/org/apache/flink/streaming/api/functions/source/FileSourceFunction.java On 28 May 2016 at 17:52, Chesnay Schepler wrote: ExecutionEnvironment.readTextFile will read the file in parallel. On 28.05.2016 09:59, David Olsen wrote: After searching on the i…
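
A minimal batch sketch of Chesnay's point: readTextFile creates input splits for the file, so the read runs with the environment's parallelism. The path and class name below are hypothetical.

    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;

    public class ParallelReadSketch {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
            env.setParallelism(4);   // the file is split and read by 4 parallel source tasks

            DataSet<String> lines = env.readTextFile("file:///tmp/input/big.txt");   // hypothetical path
            System.out.println("line count: " + lines.count());
        }
    }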

Parallel read text

2016-05-28 Thread David Olsen
After searching on the internet (with keywords like 'apache flink parallel read text') I still did not find the answer I am looking for, so I am asking here before jumping in to write code ... My problem is that I want to read a text file, or split text files, from the local file system, and I want to paralle…
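
For the "split text files" part of the question, a hedged addition: pointing readTextFile at a directory (hypothetical path below) makes each file, and each split of a large file, a separate unit of parallel work; with a plain local-filesystem path, every worker must be able to see that path.

    // Reusing the ExecutionEnvironment from the sketch in the reply above; hypothetical directory.
    DataSet<String> lines = env.readTextFile("file:///tmp/input-parts/");
    System.out.println("total lines: " + lines.count());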