You do not need recent versions of Spark, Kafka, or Structured
Streaming in order to do this. Normal DStreams are sufficient.
You can parallelize your static data from the database to an RDD, and
there's a join method available on RDDs. Transforming a single given
timestamp line into multiple li
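The DStream approach described above could look roughly like the following spark-shell-style sketch. It stands in for the Kafka stream with a queueStream so it is self-contained; the lookup values and key layout are illustrative, not from the thread, and in practice you would load the static RDD over JDBC and create the stream with KafkaUtils:

```scala
import scala.collection.mutable
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setMaster("local[2]").setAppName("dstream-join")
val ssc = new StreamingContext(conf, Seconds(10))

// Static data from the database, parallelized to an RDD keyed by ID
// (values made up here; you would fetch them over JDBC).
val lookupRdd: RDD[(Int, Double)] =
  ssc.sparkContext.parallelize(Seq(1 -> 0.5, 5 -> 1.2))

// Stand-in for the Kafka stream: a queue of (ID, timestamp) micro-batches.
val queue = mutable.Queue(ssc.sparkContext.parallelize(
  Seq(1 -> "2016-12-06 13:00", 5 -> "2016-12-06 13:40")))
val messages = ssc.queueStream(queue)

// transform() exposes each micro-batch as an RDD, where join is available:
// each (ID, timestamp) pair gets matched with its database value.
val joined = messages.transform(batch => batch.join(lookupRdd))
joined.print()

ssc.start()
ssc.awaitTermination()
```

Note that if the database table changes over time you would reload the RDD inside transform() rather than capturing it once.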
Hi Daniela,
This is trivial with Structured Streaming. If your Kafka cluster is 0.10.0
or above, you may use Spark 2.0.2 to create a Streaming DataFrame from
Kafka, and then also create a DataFrame using the JDBC connection, and you
may join those. In Spark 2.1, there's support for a function call
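A rough sketch of that Structured Streaming setup, assuming Spark 2.0.2+ against Kafka 0.10+; the bootstrap servers, topic, JDBC URL, table name, and JSON field names are placeholders:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.get_json_object

val spark = SparkSession.builder.appName("stream-join").getOrCreate()
import spark.implicits._

// Streaming DataFrame from Kafka; the payload arrives in the `value` column.
val stream = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "host:9092")
  .option("subscribe", "events")
  .load()
  .selectExpr("CAST(value AS STRING) AS json")

// Static DataFrame over the database table holding m per ID.
val lookup = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://dbhost/db")
  .option("dbtable", "id_values")
  .load()

// Pull the fields out of the JSON payload, then join on ID.
// (Spark 2.1 adds a from_json function that simplifies this parsing step.)
val parsed = stream.select(
  get_json_object($"json", "$.timestamp").as("timestamp"),
  get_json_object($"json", "$.ID").cast("int").as("ID"))

val joined = parsed.join(lookup, "ID")

joined.writeStream.format("console").start().awaitTermination()
```

Stream-to-static joins like this are supported out of the box; the JDBC side is read as an ordinary batch DataFrame.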
Hi
I have some questions regarding Spark Streaming.
I receive a stream of JSON messages from Kafka.
The messages consist of a timestamp and an ID.
timestamp          ID
2016-12-06 13:00    1
2016-12-06 13:40    5
...
In a database I have values for each ID:
ID m