> which they are
> called depends on what data is available.
>
> In general, it is not possible to share local operator state among
> different operators (or even parallel instances of the same operator).
>
> Hope this helps,
> Fabian
Hi Philipp,
If I got your requirements right, you would like to:
1) load an initial hashmap via JDBC
2) update the hashmap from a stream
3) use the hashmap to enrich another stream.
You can use a CoFlatMap to do this:
stream1.connect(stream2).flatMap(new YourCoFlatMapFunction());
YourCoFlatMapFunction implements the CoFlatMapFunction interface, which has
two methods: flatMap1 is called for elements of stream1 and flatMap2 for
elements of stream2.
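That pattern can be sketched in plain Java as follows. This is only a model of the logic, not Flink's actual API: the class and method names mirror CoFlatMapFunction, but the real interface is generic and emits via a Collector rather than returning a list, and the initial JDBC load would happen in open().

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative stand-in for a CoFlatMapFunction over two connected streams.
class EnrichingCoFlatMap {
    // Local operator state: the lookup table (initially filled via JDBC).
    private final Map<String, String> lookup = new HashMap<>();

    // flatMap1: called for elements of stream1, the stream to enrich.
    // Emits an enriched tuple if the key is known, otherwise nothing.
    List<String> flatMap1(String key) {
        List<String> out = new ArrayList<>();
        String extra = lookup.get(key);
        if (extra != null) {
            out.add(key + "," + extra);
        }
        return out;
    }

    // flatMap2: called for elements of stream2, the update stream.
    // It only mutates the local map and emits nothing.
    void flatMap2(String key, String value) {
        lookup.put(key, value);
    }
}
```

Note that which of the two methods fires for a given record depends only on which input stream the record arrived on; the runtime interleaves the calls as data becomes available.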
the two functions (I am using the
DataStream API)?
Thanks
Philipp
--
View this message in context:
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Enriching-a-tuple-mapped-from-a-datastream-with-data-coming-from-a-JDBC-source-tp8993p9299.html
Sent from the Apache Flink User Mailing List archive at Nabble.com.
> JDBCInput concept which one could use with the DataSet API
> and was wondering if I could use that somehow in my open() method then?
>
> Thanks
> Philipp
wondering if I could use that somehow in my open() method then?
Thanks
Philipp
--
View this message in context:
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Enriching-a-tuple-mapped-from-a-datastream-with-data-coming-from-a-JDBC-source-tp8993p9002.html
Sent from the Apache Flink User Mailing List archive at Nabble.com.
Hi Philipp,
the easiest way is a RichMap. In the open() method you can load the
relevant database table into memory (e.g. into a HashMap). In the
map() method you then just look up the entry in the HashMap.
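The open()/map() split can be sketched in plain Java like this. The class name, the key/value columns, and the hard-coded rows are all stand-ins for the real JDBC query; Flink's actual RichMapFunction is generic and its open() takes a Configuration argument.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative stand-in for a RichMapFunction that enriches by lookup.
class EnrichmentRichMap {
    private Map<String, String> table;

    // open(): runs once per parallel instance before any map() call.
    // In a real job this is where the JDBC SELECT would fill the map;
    // two hard-coded rows stand in for the query result here.
    void open() {
        table = new HashMap<>();
        table.put("user-1", "premium");
        table.put("user-2", "basic");
    }

    // map(): a pure in-memory lookup, called once per stream element.
    String map(String key) {
        String v = table.get(key);
        return key + "," + (v != null ? v : "unknown");
    }
}
```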
Of course, this only works if the dataset is small enough to fit in
memory. Is it?
Cheers,
Kon
Hi there,
I have a data stream (coming from Kafka) that contains information which I
want to enrich with information that sits in a database before I hand over
the enriched tuple to a sink.
How would I do that ?
I was thinking of somehow combining my streaming job with a JDBC input but
wasn't very successful.