last processed
> sequence number.
>
>
>
> I am happy to take a deeper look if you can provide more
> information/logs/code.
>
>
>
> Thanks,
>
>
>
> *From: *Ying Xu
> *Date: *Monday, 14 September 2020 at 19:48
> *To: *Andrey Zagrebin
> *Cc: *J
And I suspect I have been throttled by DynamoDB Streams. I contacted AWS support
but got no response beyond a suggestion to increase WCU and RCU.
Is it possible that Flink will lose exactly-once semantics when throttled?
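One thing worth checking on the throttling question: the consumer retries GetRecords with backoff, and its retry behaviour is configurable through consumer properties. The property key strings below are what I believe backs ConsumerConfigConstants in flink-connector-kinesis around 1.8, but please verify them against your connector version; this is a minimal, dependency-free sketch that only builds the Properties object you would pass to the consumer.

```java
import java.util.Properties;

public class ConsumerTuning {
    /**
     * Build consumer properties that tolerate throttling by retrying
     * GetRecords more times with longer exponential backoff.
     * NOTE: the key strings are assumptions based on
     * ConsumerConfigConstants in flink-connector-kinesis; check them
     * against the connector version you actually run.
     */
    public static Properties throttleTolerantProps() {
        Properties props = new Properties();
        props.setProperty("flink.shard.getrecords.maxretries", "10");
        props.setProperty("flink.shard.getrecords.backoff.base", "1000");
        props.setProperty("flink.shard.getrecords.backoff.max", "30000");
        props.setProperty("flink.shard.getrecords.intervalmillis", "500");
        return props;
    }

    public static void main(String[] args) {
        Properties p = throttleTolerantProps();
        System.out.println(p.getProperty("flink.shard.getrecords.maxretries"));
    }
}
```

Also note that DynamoDB Streams retains records for at most 24 hours; if throttling makes the consumer fall far enough behind that records expire before they are read, they are gone regardless of Flink's checkpointing, so that is one failure mode worth ruling out.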
On Thu, Sep 10, 2020 at 10:31 PM Jiawei Wu
wrote:
> Hi Andrey,
>
> Thanks
t bugfix releases.
> I will cc Ying Xu who might have a better idea about the DynamoDB source.
>
> Best,
> Andrey
>
> On Thu, Sep 10, 2020 at 3:10 PM Jiawei Wu
> wrote:
>
>> Hi,
>>
>> I'm using AWS kinesis analytics application with Flink 1.8. I am usin
Hi,
I'm using AWS kinesis analytics application with Flink 1.8. I am using
the FlinkDynamoDBStreamsConsumer to consume DynamoDB stream records. But
recently I found my internal state is wrong.
After printing some logs I found that some DynamoDB stream records are skipped
and not consumed by Flink. May
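When diagnosing "skipped" records from logs, it helps to check the sequence numbers you logged: DynamoDB Streams sequence numbers are numeric strings that increase within a single shard (but not across shards), so apparent gaps often turn out to be records from different shards interleaved in one log. A small self-contained helper, as a sketch, to flag out-of-order pairs in a per-shard log:

```java
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.List;

public class SequenceGapCheck {
    /**
     * Given sequence numbers logged in arrival order for ONE shard,
     * return the (previous, next) pairs where next is not strictly
     * greater than previous. Any hit means either genuine reordering
     * or a log that mixes records from different shards.
     */
    public static List<String[]> outOfOrderPairs(List<String> seqNums) {
        List<String[]> bad = new ArrayList<>();
        for (int i = 1; i < seqNums.size(); i++) {
            BigInteger prev = new BigInteger(seqNums.get(i - 1));
            BigInteger next = new BigInteger(seqNums.get(i));
            if (next.compareTo(prev) <= 0) {
                bad.add(new String[] {seqNums.get(i - 1), seqNums.get(i)});
            }
        }
        return bad;
    }
}
```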
I have posted this question in StackOverflow:
https://stackoverflow.com/questions/61334549/flink-richsinkfunction-constructor-vs-open
The question is:
> Let's say I need to implement a custom sink using RichSinkFunction, and I
need some variables like DBConnection in the sink. Where should I
initia
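On the RichSinkFunction question: the usual reasoning is that the function instance is constructed on the client, serialized, and shipped to the task managers, so anything non-serializable (like a DB connection) must be created in open(), which runs on the worker after deserialization. Here is a pure-JDK sketch of why (no Flink dependency; MySink and FakeConnection are made-up names standing in for the real classes):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class TransientDemo {
    // Stand-in for a DB connection: deliberately not serializable.
    static class FakeConnection {}

    // Stand-in for a RichSinkFunction: built on the client,
    // then serialized and shipped to the workers.
    static class MySink implements Serializable {
        private transient FakeConnection conn; // cannot survive the trip

        // The Flink pattern: create the resource in open(), which
        // runs on the task manager after deserialization.
        void open() {
            conn = new FakeConnection();
        }

        boolean isReady() {
            return conn != null;
        }
    }

    static MySink roundTrip(MySink sink) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(sink);
        }
        try (ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            return (MySink) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        MySink sink = new MySink();
        sink.open();                           // created on the "client"...
        MySink shipped = roundTrip(sink);
        System.out.println(shipped.isReady()); // false: lost in serialization
        shipped.open();                        // ...so re-create it in open()
        System.out.println(shipped.isReady()); // true
    }
}
```

Initializing the connection in the constructor either fails serialization outright or silently arrives null on the worker, which is why open() is the right place.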
Hi,
I have a quick question about the "EventTimeTrigger". I notice it's based
on TimeWindow instead of Window. Is there any reason why this cannot apply
to GlobalWindow?
Thanks,
Jiawei
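On the EventTimeTrigger question, my understanding (worth checking against the Flink source) is that the trigger fires when the watermark passes window.maxTimestamp(). TimeWindow supplies a finite max timestamp (end - 1), while GlobalWindow.maxTimestamp() is Long.MAX_VALUE, so the same condition could never fire on an unbounded stream; that is why GlobalWindows is normally paired with a custom or count trigger. A toy sketch of just the firing condition:

```java
public class TriggerCondition {
    // The event-time firing rule, as in EventTimeTrigger:
    // fire once the watermark reaches the window's max timestamp.
    static boolean shouldFire(long watermark, long windowMaxTimestamp) {
        return windowMaxTimestamp <= watermark;
    }

    public static void main(String[] args) {
        long timeWindowMax = 9_999L;           // e.g. a [0, 10000) TimeWindow
        long globalWindowMax = Long.MAX_VALUE;  // GlobalWindow.maxTimestamp()

        System.out.println(shouldFire(10_000L, timeWindowMax));   // true
        System.out.println(shouldFire(10_000L, globalWindowMax)); // false
        // No watermark below Long.MAX_VALUE ever satisfies the
        // condition for GlobalWindow, so the trigger never fires.
    }
}
```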
ses, just go lambda. You will not gain
>> much with Flink, especially if you already have the experience.
>> If you know your application will grow out of these use cases or is more
>> complex to begin with, consider Flink.
>>
>> There is also one relatively new techno
> inventory like inbounded 17 days ago, and there are no new events coming
> about that inventory,
> then the calculation would not be triggered and you can't sum it, right?
>
> Best,
> Kurt
>
>
> On Wed, Mar 11, 2020 at 10:06 AM Jiawei Wu
> wrote:
>
>> Hi Ro
/java/org/apache/flink/streaming/connectors/kinesis/examples/ConsumeFromDynamoDBStreams.java
> Here's also some info: https://issues.apache.org/jira/browse/FLINK-4582
>
> For writing to DynamoDB there is currently no official sink in Flink. It
> should be fairly straightforward t
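Since there is no official DynamoDB sink, the core of a hand-rolled one is usually just buffering plus batched flushes, because DynamoDB's BatchWriteItem accepts at most 25 items per request. Below is a dependency-free sketch of only that buffering logic (the class name and methods are made up; in a real sink, write() would be invoke(), flush() would call BatchWriteItem via the AWS SDK, and you would also flush from snapshotState() for at-least-once delivery):

```java
import java.util.ArrayList;
import java.util.List;

public class BufferingSinkSketch {
    // DynamoDB's BatchWriteItem accepts at most 25 items per request.
    static final int BATCH_LIMIT = 25;

    private final List<String> buffer = new ArrayList<>();
    private final List<List<String>> flushedBatches = new ArrayList<>();

    /** Buffer a record; flush automatically when the batch is full. */
    void write(String record) {
        buffer.add(record);
        if (buffer.size() >= BATCH_LIMIT) {
            flush();
        }
    }

    /** Emit whatever is buffered as one batch (a real sink would
     *  send it with BatchWriteItem and handle unprocessed items). */
    void flush() {
        if (!buffer.isEmpty()) {
            flushedBatches.add(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    List<List<String>> batches() {
        return flushedBatches;
    }
}
```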
Hi flink users,
We have a problem and think Flink may be a good solution for it. But I'm
new to Flink and hope I can get some insights from the Flink community :)
Here is the problem. Suppose we have a DynamoDB table which stores the
inventory data; the schema is like:
* vendorId (primary key)
* inve