Hi, 

 I am new to Spark and I would like to know how to dynamically compute
real-time visualizations using Spark Streaming with Kafka.

Use case: We have a real-time analytics dashboard (reports and dashboards)
where a user can define a report (visualization) with certain parameters,
such as the refresh period and a choice of metrics (segment variables &
profile variables).

We should compute only the visualizations that are in use (i.e. that users
are currently accessing), from events arriving on Kafka streams, using Spark
Streaming.

Solution: One approach is to compute the visualizations for every incoming
message and write the results back into result streams, which the
application then consumes (see the sketch below).
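
Something like this minimal sketch is what I have in mind (Scala, Spark 1.x
direct Kafka stream). Note that activeMetrics(), the broker address, the
topic names, and the simple "metric,value" message format are just
placeholders for illustration; in our system the active set would come from
the dashboard backend:

import java.util.Properties
import kafka.serializer.StringDecoder
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object VisualizationJob {
  // Placeholder: in practice this would query the dashboard backend for
  // the metrics belonging to visualizations users currently have open.
  def activeMetrics(): Set[String] = Set("pageViews", "clicks")

  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("RealtimeViz"), Seconds(5))

    // Direct Kafka stream of (key, value) pairs from the "events" topic
    val events = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, Map("metadata.broker.list" -> "broker1:9092"), Set("events"))

    events.foreachRDD { rdd =>
      val active = activeMetrics()  // refresh the in-use set each batch
      rdd.map(_._2.split(","))      // assume "metric,value" messages
        .collect { case Array(m, v) if active.contains(m) => (m, v.toDouble) }
        .reduceByKey(_ + _)         // one aggregate per in-use metric
        .foreachPartition { part =>
          // Write the aggregated results back to a "results" Kafka topic
          val props = new Properties()
          props.put("bootstrap.servers", "broker1:9092")
          props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
          props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
          val producer = new KafkaProducer[String, String](props)
          part.foreach { case (m, total) =>
            producer.send(new ProducerRecord("results", m, total.toString))
          }
          producer.close()
        }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

(Creating a producer per partition per batch is just to keep the sketch
simple; a shared or pooled producer would be better in practice.)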

I would like to know whether there is a better approach. Please advise.

Thanks,
Suresh


