It doesn't.
However, if you have a very large number of keys, a small number of which
are very large, you can do one of the following:
A. Use a custom partitioner that counts the number of items in a key and
avoids putting large keys together; alternatively, if feasible (and
needed), include part o
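The custom-partitioner idea in option A could be sketched roughly as follows. This is a hypothetical illustration, not code from this thread: it assumes the heavy keys have already been identified (e.g. by a prior `countByKey`), and the name `SkewAwarePartitioner` is made up for the example.

```scala
import org.apache.spark.Partitioner

// Hypothetical skew-aware partitioner: each known-heavy key gets its own
// dedicated partition; all remaining keys are hashed into the rest.
class SkewAwarePartitioner(total: Int, heavyKeys: Seq[Any]) extends Partitioner {
  require(total > heavyKeys.size, "need partitions left over for normal keys")

  private val heavyIndex: Map[Any, Int] = heavyKeys.zipWithIndex.toMap

  override def numPartitions: Int = total

  override def getPartition(key: Any): Int = heavyIndex.get(key) match {
    // Reserve the first heavyKeys.size partitions, one per heavy key.
    case Some(i) => i
    // Non-negative modulo (hashCode can be negative) into the remainder.
    case None =>
      val n = total - heavyKeys.size
      heavyKeys.size + (((key.hashCode % n) + n) % n)
  }
}

// Usage (illustrative): rdd.partitionBy(new SkewAwarePartitioner(200, heavy))
```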
river.host")
> allExecutors.filter(! _.split(":")(0).equals(driverHost)).toList
> }
>
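For readers of the archive: the truncated snippet quoted above looks like the common Spark 1.x pattern of listing executor hosts and filtering out the driver. A self-contained reconstruction of that pattern, under the assumption that `allExecutors` came from `sc.getExecutorMemoryStatus` (whose keys are `"host:port"` strings):

```scala
// Sketch (Spark 1.x): executor hosts, excluding the driver's host.
// getExecutorMemoryStatus is keyed by "host:port"; in local mode the
// driver itself appears in this map, hence the filter.
val allExecutors = sc.getExecutorMemoryStatus.map(_._1)
val driverHost = sc.getConf.get("spark.driver.host")
val executorHosts = allExecutors
  .filter(!_.split(":")(0).equals(driverHost))
  .toList
```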
> On Friday, August 21, 2015 1:53 PM, Virgil Palanciuc
> wrote:
>
>
> Hi Akhil,
>
> I'm using spark 1.4.1.
> Number of executors is not in the co
unning-executors-td19453.html
>
>
> http://mail-archives.us.apache.org/mod_mbox/spark-user/201411.mbox/%3ccacbyxk+ya1rbbnkwjheekpnbsbh10rykuzt-laqgpdanvhm...@mail.gmail.com%3E
> On Aug 21, 2015 7:42 AM, "Virgil Palanciuc" wrote:
>
>> Is there any reliable way to find out the number of e
Is there any reliable way to find out the number of executors
programmatically - regardless of how the job is run? A method that
preferably works for spark-standalone, YARN, and Mesos, regardless of
whether the code runs from the shell or not?
Things that I tried and don't work:
- sparkContext.getExecuto
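One cluster-manager-agnostic alternative (my suggestion, not something proposed in this thread) is to count executors with a `SparkListener`, which receives add/remove events the same way under standalone, YARN, and Mesos (the executor events exist from Spark 1.3 on, so they should be available on 1.4.1):

```scala
import java.util.concurrent.atomic.AtomicInteger
import org.apache.spark.scheduler.{SparkListener,
  SparkListenerExecutorAdded, SparkListenerExecutorRemoved}

// Counts live executors by listening to scheduler events.
class ExecutorCountListener extends SparkListener {
  val count = new AtomicInteger(0)
  override def onExecutorAdded(e: SparkListenerExecutorAdded): Unit =
    count.incrementAndGet()
  override def onExecutorRemoved(e: SparkListenerExecutorRemoved): Unit =
    count.decrementAndGet()
}

// Usage: register early, before executors come up, then poll count.get().
// val listener = new ExecutorCountListener
// sc.addSparkListener(listener)
```

Note the count is eventually consistent: executors added before the listener is registered are missed, so register it as soon as the context is created.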
Hi,
The Spark documentation states that "If accumulators are created with a
name, they will be displayed in Spark’s UI"
http://spark.apache.org/docs/latest/programming-guide.html#accumulators
Where exactly are they shown? I may be dense, but I can't find them on the
UI from http://localhost:4040
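For anyone hitting the same question: as far as I can tell (this is my reading of the UI, not an authoritative answer), named accumulators do not appear on the front page at port 4040 - they show up in an "Accumulators" table on the *detail page of a stage* that actually updated them (Stages tab, then click into a specific stage). A minimal example using the 1.x API from the programming guide:

```scala
// Named accumulator (Spark 1.x API from the programming guide).
val accum = sc.accumulator(0, "My Counter")

// The accumulator only appears in the UI for stages that update it,
// so run a job that does.
sc.parallelize(1 to 100).foreach(_ => accum += 1)

// Now open http://localhost:4040 -> Stages -> (the foreach stage);
// "My Counter" should be listed in that stage's Accumulators table.
```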