the table into textformat and directly reading it from HDFS works without any
problem and with a lot of free mem…
Greetings,
Arnaud
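(For illustration only, a minimal sketch of what reading the dumped table
directly from HDFS could look like; the RefTableLoader name, the path and the
"key;value" line layout are assumptions, not details from the thread.)

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/** Reads a reference table dumped to HDFS as text lines into an in-memory map. */
public class RefTableLoader {

    /** Assumed layout: one "key;value" pair per line. */
    public static Map<String, String> load(String hdfsPath) throws Exception {
        Map<String, String> table = new HashMap<>();
        Path path = new Path(hdfsPath); // e.g. "hdfs:///tmp/ref_table.txt" (hypothetical)
        FileSystem fs = path.getFileSystem(new Configuration());
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(fs.open(path)))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(";", 2);
                if (fields.length == 2) {
                    table.put(fields[0], fields[1]);
                }
            }
        }
        return table;
    }
}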
From: LINZ, Arnaud
Sent: Thursday, November 12, 2015 17:48
To: 'user@flink.apache.org'
Subject: Join Stream with big ref table
Hello,
I have to enrich a stream with a big reference table (11,000,000 rows). I
cannot use “join” because I cannot window the stream; so in the “open()”
function of each mapper I read the content of the table and put it in a HashMap
(stored on the heap).
11M rows is quite big but it should t
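(A minimal sketch of the open()/HashMap pattern described above, reusing the
hypothetical RefTableLoader from the sketch earlier in the thread; treating the
incoming record itself as the lookup key is also an assumption.)

import java.util.Map;

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

/** Enriches each record via a per-task heap HashMap built once in open(). */
public class EnrichMapper extends RichMapFunction<String, String> {

    private transient Map<String, String> refTable;

    @Override
    public void open(Configuration parameters) throws Exception {
        // Load the ~11M-row reference table into this task's heap
        // (path is the hypothetical one from the loader sketch).
        refTable = RefTableLoader.load("hdfs:///tmp/ref_table.txt");
    }

    @Override
    public String map(String record) {
        // Assumed: the stream record itself is the lookup key.
        String enrichment = refTable.getOrDefault(record, "UNKNOWN");
        return record + ";" + enrichment;
    }
}

It would then be applied as stream.map(new EnrichMapper()) on a DataStream<String>,
so each parallel task holds its own copy of the map.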