+1 for looking into Kafka Streams. You can build and maintain state within
your app, and expose it on a REST endpoint for your node/react app to
query.
There's an example of this here:
https://github.com/confluentinc/kafka-streams-examples/blob/5.2.1-post/src/main/java/io/confluent/examples/streams/interactivequeries/kafkamusic/KafkaMusicExample.java
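
To give a rough sketch of that interactive-queries pattern (everything here is an
assumption to adapt: the topic name "sensor-readings", the store name
"latest-readings", String serdes, and records keyed by plant_equipment_id —
if your messages aren't keyed that way you'd add a selectKey() step first):

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;

import java.util.Properties;

public class SensorStateApp {

  public static void main(String[] args) {
    Properties props = new Properties();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sensor-state-app");
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
    props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

    StreamsBuilder builder = new StreamsBuilder();

    // Keep the latest raw JSON value per plant_equipment_id in a local,
    // queryable state store called "latest-readings".
    builder.stream("sensor-readings", Consumed.with(Serdes.String(), Serdes.String()))
           .groupByKey()
           .reduce((oldValue, newValue) -> newValue, Materialized.as("latest-readings"));

    KafkaStreams streams = new KafkaStreams(builder.build(), props);
    streams.start();
    Runtime.getRuntime().addShutdownHook(new Thread(streams::close));

    // Once the app reaches RUNNING state, a REST layer (Jersey, Spring, etc.)
    // can serve lookups straight out of the store. (A real app would retry on
    // InvalidStateStoreException while the store is still initialising.)
    ReadOnlyKeyValueStore<String, String> store =
        streams.store("latest-readings", QueryableStoreTypes.keyValueStore());
    String latestForOneMachine = store.get("some_plant_equipment_id");
  }
}

The KafkaMusicExample linked above shows the full pattern, including the REST
layer and how to query across multiple instances of the app.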

Depending on the processing you want to do, KSQL could also be useful,
writing the results of continuous queries (e.g. spotting exceptions in the
data) to a Kafka topic that your app can then consume.
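
As an illustration only (the stream/topic names and the threshold are made up,
and the column list assumes your JSON field names), the "spot exceptions" idea
in KSQL might look something like:

-- Register the existing topic as a KSQL stream
CREATE STREAM sensor_readings (
    plant_equipment_id VARCHAR,
    sensorvalue DOUBLE
  ) WITH (KAFKA_TOPIC='sensor-readings', VALUE_FORMAT='JSON');

-- Continuously write out-of-range readings to a new topic
CREATE STREAM sensor_exceptions AS
  SELECT plant_equipment_id, sensorvalue
  FROM sensor_readings
  WHERE sensorvalue > 100.0;

Your node app could then consume the sensor_exceptions topic directly, or run
further queries against it through KSQL's REST API.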


-- 

Robin Moffatt | Developer Advocate | ro...@confluent.io | @rmoff


On Tue, 9 Apr 2019 at 21:26, Nick Torenvliet <natorenvl...@gmail.com> wrote:

> Hi all,
>
> Just looking for some general guidance.
>
> We have a Kafka -> Druid pipeline we intend to use in an industrial setting
> to monitor process data.
>
> Our Kafka system receives messages on a single topic.
>
> The messages are {"timestamp": yy:mm:ddThh:mm:ss.mmm, "plant_equipment_id":
> "id_string", "sensorvalue": float}
>
> For our POC there are about 2,000 unique plant_equipment ids; this will
> quickly grow to 20,000.
>
> The Kafka topic streams into Druid.
>
> We are building some node.js/react browser-based apps for analytics and
> real-time stream monitoring.
>
> We are thinking that for visualizing historical data sets we will hit Druid
> for data.
>
> For real time streaming we are wondering what our best option is.
>
> One option is to just hit Druid semi-regularly and update the on-screen
> visualization as data arrives there.
>
> Another option is to stream a subset of the topic (somehow) from Kafka using
> some streams interface.
>
> With all the stock ticker apps out there, I have to imagine this is a
> really common use case.
>
> Anyone have any thoughts as to what we are best to do?
>
> Nick
>
