Thank you for the reply.
I checked the Web UI and found that the total number of tasks is 10.
So I changed the number of cores from 1 to 10, and now it works well.
But I haven't figured out what is happening.
My assumption is that each job consists of 10 tasks by default and each
task occupies 1 core.
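If that assumption holds, the application needs at least as many cores as source partitions, because continuous-mode tasks are long-running and never release their cores between batches. A rough sketch of sizing this at submit time (the master URL and script name below are placeholders, not my actual setup):

```shell
# Sketch only: give the application at least one core per task.
# "10" matches the 10 tasks observed in the Web UI.
spark-submit \
  --master "local[10]" \
  my_continuous_query.py
```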
One possible cause is that you don't have enough resources to launch all
the tasks for your continuous processing query. Could you check the Spark
UI and see whether all tasks are running rather than waiting for resources?
Best Regards,
Shixiong Zhu
Databricks Inc.
shixi...@databricks.com
databricks.com
Hi all,
I am now using Structured Streaming in Continuous Processing mode and I
ran into an odd problem.
My code is so simple that it is close to the sample code in the
documentation:
https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html#continuous-processing
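For reference, here is a minimal sketch along the lines of the documented sample (the rate source and console sink here stand in for my actual source and sink):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.Trigger

val spark = SparkSession.builder.appName("ContinuousSketch").getOrCreate()

// Continuous trigger with a 1-second checkpoint interval, as in the docs.
spark
  .readStream
  .format("rate")
  .option("rowsPerSecond", "10")
  .load()
  .writeStream
  .format("console")
  .trigger(Trigger.Continuous("1 second"))
  .start()
```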
When I