Thanks for the help. I set --executor-cores and it works now. I had been using
--total-executor-cores and didn't realize the flag was different.
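For anyone hitting the same confusion, a minimal sketch of the two flags (the JAR name is a placeholder): --total-executor-cores caps the total core count across the cluster and applies to standalone/Mesos masters, while on YARN you size each executor with --executor-cores.

```shell
# Standalone/Mesos: cap the total cores the app may use cluster-wide.
spark-submit --master spark://master:7077 \
  --total-executor-cores 8 \
  my-streaming-app.jar

# YARN: request cores per executor (times --num-executors for the total).
spark-submit --master yarn-cluster \
  --num-executors 2 --executor-cores 4 \
  my-streaming-app.jar
```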
Tathagata Das wrote on Fri, Jul 10, 2015 at 3:11 AM:
1. There will be a long-running job with the description "start()", as that is
the job that runs the receivers. It will never end.
2. You need to set the number of cores given to the Spark executors by the
YARN containers. That is spark.executor.cores in SparkConf, or --executor-cores
in spark-submit.
Do you have enough cores in the configured number of executors in YARN?
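As a rough capacity check (the numbers below are illustrative assumptions, not from this thread): each receiver permanently occupies one core inside the "start()" job, so the executors must provide more total cores than there are receivers, otherwise no cores remain for batch processing and the app appears to hang.

```shell
# Illustrative sizing for a job with 2 receivers:
# 2 executors x 2 cores = 4 cores total; 2 run receivers, 2 process batches.
spark-submit --master yarn-cluster \
  --num-executors 2 \
  --executor-cores 2 \
  my-streaming-app.jar
```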
On Thu, Jul 9, 2015 at 2:29 AM, Bin Wang wrote:
I'm using Spark Streaming with Kafka and submitting it to a YARN cluster in
"yarn-cluster" mode, but it hangs at StreamingContext.start(). The Kafka config
is right, since it can show some events in the "Streaming" tab of the web UI.
The attached file is a screenshot of the "Jobs" tab of the web UI. The code
in th