Just don't call .awaitTermination(), because it blocks execution of the
next line of code. You can assign the result of each .start() to a
variable, or collect the returned StreamingQuery handles in a list/array.
Then, to wait until any of the streams finishes, use
spark.streams.awaitAnyTermination()
(https://s
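A minimal sketch of that pattern (assuming an existing SparkSession named `spark` and two already-defined streaming datasets `dataset1`/`dataset2`; the broker, topic, and checkpoint paths are placeholders):

```scala
import org.apache.spark.sql.streaming.Trigger

// Start each sink without blocking: .start() returns a StreamingQuery handle.
val q1 = dataset1.writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
  .option("topic", "topic1")                           // placeholder topic
  .option("checkpointLocation", "/tmp/ckpt1")          // placeholder path
  .trigger(Trigger.Continuous("1 second"))
  .start()

val q2 = dataset2.writeStream
  .format("console")
  .start()

// Do NOT call q1.awaitTermination() here -- it would block before q2 starts.
// Instead, block once, after all queries are running, until any terminates:
spark.streams.awaitAnyTermination()
```

The key point is that `.start()` is non-blocking, so every query gets started first and the single `awaitAnyTermination()` call at the end keeps the driver alive.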
Hello,
I have a structured streaming job that needs to be able to write to
multiple sinks. We are using the *Continuous* trigger, *not* the
*Microbatch* trigger.
1. When we use the foreach method, as in:
*dataset1.writeStream.foreach(kafka ForEachWriter
logic).trigger(ContinuousMode).start().awaitTermination()*