Just don't call .awaitTermination() on each query, because it blocks execution of the
next line of code. You can assign the result of each .start() call to a
variable, or collect the returned query handles in a list/array.
Then, to wait until one of the streams finishes, use
spark.streams.awaitAnyTermination() or something like this.
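As a rough sketch (the rate source and console sink here are placeholders, just for illustration):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("multi-stream-example").getOrCreate()

# Two hypothetical streaming inputs, for illustration only.
df1 = spark.readStream.format("rate").load()
df2 = spark.readStream.format("rate").load()

# Keep the StreamingQuery handles instead of calling .awaitTermination()
# on each one, which would block before the next query ever starts.
queries = [
    df1.writeStream.format("console").start(),
    df2.writeStream.format("console").start(),
]

# Block here until any one of the active queries terminates.
spark.streams.awaitAnyTermination()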
https://issues.apache.org/jira/browse/SPARK-36722
https://github.com/apache/spark/pull/33968
On 2021/09/11 10:06:50, Bjørn Jørgensen wrote:
> Hi, I am using "from pyspark import pandas as ps" in a master build from yesterday.
> I have some columns that I need to join into one.
> In pandas I u