I have figured out the problem here. It turned out that there was a conflict
in my SparkConf when I was running my application with yarn in cluster
mode: I was setting the master to local[4] inside my application, while
setting it to yarn-cluster with spark-submit. Now I have changed my
SparkConf so that the master is no longer set inside the application.
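A minimal sketch of the fix described above, assuming the master is set only via spark-submit and never hard-coded in the SparkConf (the class name and jar path here are placeholders, not from the thread):

```shell
# In the driver code, build the conf WITHOUT setMaster(), e.g.:
#   val conf = new SparkConf().setAppName("MyStreamingApp")
# Then let spark-submit supply the master at deploy time
# (--master yarn-cluster was the Spark 1.x syntax used in this thread):
spark-submit \
  --master yarn-cluster \
  --class com.example.MyStreamingApp \
  myapp.jar
```

This way the same jar can be run locally or on YARN just by changing the --master flag, instead of the hard-coded local[4] silently overriding the cluster setting.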

Thanks Akhil for your input.

I have already tried with 3 executors and it still results in the same
problem. So, as Sean mentioned, the problem does not seem to be related to
that.
On Sat, Nov 22, 2014 at 11:00 AM, Sean Owen wrote:
> That doesn't seem to be the problem though. It processes but then stops.
> Presumably there are many executors.
That doesn't seem to be the problem though. It processes but then stops.
Presumably there are many executors.
On Nov 22, 2014 9:40 AM, "Akhil Das" wrote:
> For Spark Streaming, you must always set *--executor-cores* to a value
> that is >= 2; otherwise it will not do any processing.
>
> Thanks
> Best Regards
For Spark Streaming, you must always set *--executor-cores* to a value that
is >= 2; otherwise it will not do any processing.
Thanks
Best Regards
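Akhil's advice can be expressed as a spark-submit invocation like the one below. The usual explanation is that a receiver-based streaming job pins one core per receiver, so an executor needs at least one more core to actually process the received batches (class name and jar path are placeholders):

```shell
# Sketch: give each executor >= 2 cores -- one for the receiver task,
# one (or more) for processing the received data.
spark-submit \
  --master yarn-cluster \
  --executor-cores 2 \
  --num-executors 3 \
  --class com.example.MyStreamingApp \
  myapp.jar
```

With only one core per executor and a receiver occupying it, batches are received but never processed, which matches the "no processing" symptom described here.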
On Sat, Nov 22, 2014 at 8:39 AM, pankaj channe wrote:
> I have seen similar posts on this issue but could not find a solution.
> Apologies if this has been asked before.