between the actual event data to be processed and the metadata coming in on a second stream.
Sathi
From: Till Rohrmann
Reply-To: "user@flink.apache.org"
Date: Thursday, February 2, 2017 at 2:10 PM
To: "user@flink.apache.org"
Subject: Re: broadcasting a stream from a collection that
Hi Sathi,
I would ingest the metadata into a Kinesis queue as well and read the data
from there. Then you don't have to fiddle around with the REST API from
within your Flink job.
If that is not feasible for you, then you can also write your own custom
source function which queries the REST endpoint.
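
A minimal sketch of such a custom source, assuming the metadata is fetched as a plain String payload from a hypothetical endpoint URL and re-polled at a fixed interval (both are assumptions; adapt the parsing and interval to your actual service):

import org.apache.flink.streaming.api.functions.source.SourceFunction;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.stream.Collectors;

public class RestMetadataSource implements SourceFunction<String> {

    private static final long POLL_INTERVAL_MS = 60_000L; // assumed refresh interval
    private final String endpointUrl;                      // hypothetical metadata endpoint
    private volatile boolean running = true;

    public RestMetadataSource(String endpointUrl) {
        this.endpointUrl = endpointUrl;
    }

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        while (running) {
            HttpURLConnection conn = (HttpURLConnection) new URL(endpointUrl).openConnection();
            conn.setRequestMethod("GET");
            try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
                String payload = reader.lines().collect(Collectors.joining("\n"));
                // Emit the raw payload; parsing it into your metadata type would happen here.
                synchronized (ctx.getCheckpointLock()) {
                    ctx.collect(payload);
                }
            } finally {
                conn.disconnect();
            }
            Thread.sleep(POLL_INTERVAL_MS);
        }
    }

    @Override
    public void cancel() {
        running = false;
    }
}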
It’s good to be here. I have a data stream coming from Kinesis. I also have a
list of hashmaps which holds metadata that needs to participate in the
processing.
In my Flink processor class I construct this metadata (hardcoded):
public static void main(String[] args) throws Exception {
…….//
Dat
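
For context, a minimal sketch of the broadcast pattern this thread is about: the hardcoded collection is turned into a stream with fromCollection(), broadcast to all parallel instances, and connected to the Kinesis event stream. This is not the original job; the stream name, AWS properties, map types and enrichment logic are assumptions, and the imports follow the Flink 1.2-era API current at the time of this thread:

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.co.CoFlatMapFunction;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
import org.apache.flink.util.Collector;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Properties;

public class MetadataBroadcastJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Event stream from Kinesis (stream name, region and credentials are placeholders).
        Properties kinesisProps = new Properties();
        kinesisProps.setProperty(ConsumerConfigConstants.AWS_REGION, "us-east-1");
        kinesisProps.setProperty(ConsumerConfigConstants.AWS_ACCESS_KEY_ID, "<access-key>");
        kinesisProps.setProperty(ConsumerConfigConstants.AWS_SECRET_ACCESS_KEY, "<secret-key>");
        DataStream<String> events = env.addSource(
                new FlinkKinesisConsumer<>("events", new SimpleStringSchema(), kinesisProps));

        // Hardcoded metadata, as described in the question.
        List<HashMap<String, String>> metadataList = new ArrayList<>();
        HashMap<String, String> entry = new HashMap<>();
        entry.put("someKey", "someValue");
        metadataList.add(entry);

        // Turn the collection into a stream and broadcast it to every parallel task.
        DataStream<HashMap<String, String>> metadata = env.fromCollection(metadataList).broadcast();

        // Connect the two streams; flatMap2 caches metadata, flatMap1 uses it to enrich events.
        events.connect(metadata)
                .flatMap(new CoFlatMapFunction<String, HashMap<String, String>, String>() {
                    private final List<HashMap<String, String>> cached = new ArrayList<>();

                    @Override
                    public void flatMap1(String event, Collector<String> out) {
                        // Enrichment logic is an assumption; here we just tag the event.
                        out.collect(event + " [metadata entries: " + cached.size() + "]");
                    }

                    @Override
                    public void flatMap2(HashMap<String, String> meta, Collector<String> out) {
                        cached.add(meta);
                    }
                })
                .print();

        env.execute("metadata broadcast sketch");
    }
}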