Thank you for your reply.
I checked the Web UI and found that the total number of tasks is 10.
So I changed the number of cores from 1 to 10, and then it worked well.
But I haven't figured out what is happening.
My assumption is that each job consists of 10 tasks by default, and each
task occupies 1 core.
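For what it's worth, my understanding (a sketch, not a definitive explanation) is that in continuous processing mode each input partition becomes a long-running task that pins one core for the lifetime of the query, so the number of tasks tracks the source's partition count rather than a fixed default. A minimal illustration of the setup I mean, assuming a Kafka source with 10 partitions (the topic name, broker address, and master URL are all illustrative):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.Trigger

// 10 local cores, matching the 10 partitions of the assumed source topic,
// so every long-running continuous task can be scheduled at once.
val spark = SparkSession.builder()
  .master("local[10]")
  .appName("continuous-demo")
  .getOrCreate()

val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker
  .option("subscribe", "events")                       // assumed 10-partition topic
  .load()

// Trigger.Continuous selects continuous processing mode; each of the
// 10 partitions is read by one task that occupies one core indefinitely.
df.writeStream
  .format("console")
  .trigger(Trigger.Continuous("1 second"))
  .start()
```

With `local[1]` the 10 tasks would compete for a single core and the query could not make progress, which matches the behavior I saw before increasing the core count.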
One possible cause is that you don't have enough resources to launch all the tasks
for your continuous processing query. Could you check the Spark UI and see
whether all tasks are running rather than waiting for resources?
Best Regards,
Shixiong Zhu
Databricks Inc.
shixi...@databricks.com
databricks.com