Hi Sathi,
I believe the issue is that you pushed the event into the stream first and
only then started a consumer app to read it. If you push an event into the
Kinesis stream before starting a reader whose initial stream position is
set to LATEST, that reader will not see the record.
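If the goal is to also consume records that were already in the stream, the consumer's initial position can be set to TRIM_HORIZON instead of LATEST. A minimal sketch of the relevant consumer properties (the region, stream name, and the exact constant names are illustrative; in the Flink Kinesis connector they are usually referenced via `ConsumerConfigConstants`, whose names vary by version):

```java
import java.util.Properties;

public class KinesisInitPos {
    public static void main(String[] args) {
        // Consumer config for a FlinkKinesisConsumer; keys written as string
        // literals here, normally referenced via ConsumerConfigConstants.
        Properties props = new Properties();
        props.setProperty("aws.region", "us-east-1"); // assumed region
        // TRIM_HORIZON starts from the oldest record still retained in the
        // stream, so events pushed before the consumer starts are read too.
        props.setProperty("flink.stream.initpos", "TRIM_HORIZON");
        System.out.println(props.getProperty("flink.stream.initpos"));
        // These props would then be passed to the consumer, e.g.:
        // new FlinkKinesisConsumer<>("my-stream", new SimpleStringSchema(), props)
    }
}
```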
> […] should not be read multiple times by the different parallel instances.
> How did you check / find out that each node is reading all the data?
>
> Regards,
> Robert
>
> On Tue, Nov 22, 2016 at 7:42 PM, Alex Reid
> wrote:
Hi, I've been playing around with using apache flink to process some data,
and I'm starting out using the batch DataSet API.
To start, I read in some data from files in an S3 folder:
DataSet<String> records = env.readTextFile("s3://my-s3-bucket/some-folder/");
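One thing worth noting for this kind of job: Flink's text input format decompresses `.gz` inputs transparently, but each gzip file becomes a single non-splittable input split, so one file cannot be read by multiple parallel subtasks. The decompression itself is ordinary gzip, as in this plain-Java sketch (the temp file stands in for one S3 object; it is not Flink code):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipDemo {
    public static void main(String[] args) throws Exception {
        // Write a small gzipped text file (a stand-in for one S3 object).
        Path tmp = Files.createTempFile("records", ".gz");
        try (GZIPOutputStream out = new GZIPOutputStream(Files.newOutputStream(tmp))) {
            out.write("line1\nline2\n".getBytes("UTF-8"));
        }
        // Read it back line by line, as a text input format would.
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                new GZIPInputStream(Files.newInputStream(tmp)), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        Files.delete(tmp);
    }
}
```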
Within the folder, there are 20 gzipped files