Hi Miki,

What are those several ways? Could you point me to some references?

Use case:

We have a continuous stream of credit card transactions flowing into a Kafka
topic, along with a set of credit card defaulters in a .csv file (which gets
updated every day).
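
For concreteness, a rough sketch of what I have in mind in Flink SQL. All
table names, columns, paths and broker addresses below are placeholders,
and I'm assuming a Flink version whose Kafka and filesystem SQL connectors
support the options shown:

-- Transactions arriving continuously on a Kafka topic
CREATE TABLE transactions (
  card_id  STRING,
  amount   DECIMAL(10, 2),
  txn_time TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'transactions',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'fraud-check',
  'scan.startup.mode' = 'latest-offset',
  'format' = 'json'
);

-- Daily-updated list of defaulters, kept in a .csv file
CREATE TABLE defaulters (
  card_id STRING
) WITH (
  'connector' = 'filesystem',
  'path' = 'file:///data/defaulters.csv',
  'format' = 'csv'
);

-- Flag transactions made by known defaulters
SELECT t.card_id, t.amount, t.txn_time
FROM transactions AS t
JOIN defaulters AS d
  ON t.card_id = d.card_id;

The open questions for me are how to keep the state of such a join (and of
the table built from the stream) from growing without bound, and how to
pick up the daily refresh of the .csv file.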


Thanks

Srikanth


On Fri, Sep 27, 2019 at 11:11 AM miki haiat <miko5...@gmail.com> wrote:

> I'm sure there are several ways to implement it. Can you elaborate more on
> your use case?
>
> On Fri, Sep 27, 2019, 08:37 srikanth flink <flink.d...@gmail.com> wrote:
>
>> Hi,
>>
>> My data source is Kafka. Until now I have been reading the values from the
>> Kafka stream into a table. The table just keeps growing and eventually runs
>> into a heap issue.
>>
>> I came across the eviction policy, which works only on keys, right?
>>
>> I have been researching how to configure the environment file (Flink SQL) to
>> read both the key and the value, so that eviction works on the keys and older
>> data is cleared. I have found nothing in the docs so far.
>>
>> Could someone help with that?
>> If there's no support for reading both key and value, could someone help me
>> assign a key to the table I'm building from the stream?
>>
>> Thanks
>> Srikanth
>>
>
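
Regarding the key/value question quoted above: assuming a Flink version
whose Kafka SQL connector supports the 'key.format' / 'key.fields' options,
a DDL sketch like the one below would expose the record key as a regular
column, and idle state retention (state TTL) can then be used to clear
older state. Column names, formats and the retention interval here are
only assumptions for illustration:

-- Expose the Kafka message key as the card_id column
CREATE TABLE transactions_keyed (
  card_id  STRING,            -- taken from the Kafka record key
  amount   DECIMAL(10, 2),
  txn_time TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'transactions',
  'properties.bootstrap.servers' = 'localhost:9092',
  'key.format' = 'raw',
  'key.fields' = 'card_id',
  'value.format' = 'json',
  'value.fields-include' = 'EXCEPT_KEY',
  'scan.startup.mode' = 'latest-offset'
);

-- Bound state growth: expire state that has not been updated for a day
SET 'table.exec.state.ttl' = '1 d';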
