>
> 3. In our use case we read from Kafka, do some mapping, and finally persist
> the data to Cassandra as well as push it over a remote actor for
> real-time updates in a dashboard. I used the approaches below:
> - First I tried a very naive way like stream.map(...).foreachRDD(push
> to actor)
>
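A minimal sketch of that naive pipeline, assuming the Spark Streaming DStream API with an Akka actor ref; `parse`, `saveToCassandra`, and `dashboardActor` are hypothetical placeholders, not names from the original post:

```scala
// Hedged sketch: Kafka -> map -> persist to Cassandra + push to remote actor.
// `parse`, `saveToCassandra`, `dashboardActor` are placeholders for illustration.
stream
  .map(record => parse(record.value))        // some mapping
  .foreachRDD { rdd =>
    rdd.foreachPartition { records =>        // runs on the executors
      records.foreach { r =>
        saveToCassandra(r)                   // persist each record
        dashboardActor ! r                   // real-time push for the dashboard
      }
    }
  }
```

One reason this form tends to be fragile is that anything referenced inside `foreachRDD`/`foreachPartition` (the actor ref, database connections) is captured in the closure and must be serializable, or else created per partition on the executors.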
Hi,
I have some questions regarding usage patterns and debugging in Spark/Spark
Streaming.
1. What are some common design patterns for using broadcast variables? In my
application I created some, and also a scheduled task which
periodically refreshes the variables. I want to know how efficientl