It is actually the number of cores. If your processor has hyperthreading,
it will be higher (the number of logical processors your OS sees).
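To see the logical-processor count in question, you can ask the JVM directly; this is the same number Spark's local[*] mode picks up (a minimal sketch, no Spark required):

```scala
object CoreCount {
  def main(args: Array[String]): Unit = {
    // Logical processors the OS reports to the JVM;
    // on a hyperthreaded CPU this is typically 2x the physical core count.
    println(Runtime.getRuntime.availableProcessors)
  }
}
```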
On Sun, Mar 22, 2015 at 4:51 PM, Ted Yu wrote:
> I assume spark.default.parallelism is 4 in the VM Ashish was using.
>
> Cheers
>
2 is added every time the final cross-partition combine function is called. The
result of summing the elements across partitions is 9, of course. If you force a
single partition (using spark-shell in local mode):
scala> val data = sc.parallelize(List(2,3,4), 1)
scala> data.aggregate(0)((x,y) => x+y, (x,y) => 2+x+y)
With one partition, the combine function runs once, merging the zero value with
the single partition result: 2 + 0 + 9 = 11.
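The partition-count dependence can be seen without a cluster. Below is a minimal plain-Scala sketch (no Spark needed) that simulates aggregate's semantics under the assumption that per-partition results are folded into the zero value left to right; `AggregateSim` and its `aggregate` helper are hypothetical names, not Spark API:

```scala
object AggregateSim {
  // Simulate RDD.aggregate: apply seqOp within each partition,
  // then fold the per-partition results into the zero value with combOp.
  def aggregate[T, U](partitions: Seq[Seq[T]], zero: U)
                     (seqOp: (U, T) => U, combOp: (U, U) => U): U = {
    val partResults = partitions.map(_.foldLeft(zero)(seqOp))
    partResults.foldLeft(zero)(combOp)
  }

  def main(args: Array[String]): Unit = {
    val seqOp  = (x: Int, y: Int) => x + y
    val combOp = (x: Int, y: Int) => 2 + x + y

    // One partition: combOp runs once -> 2 + 0 + (2+3+4) = 11
    println(aggregate(Seq(Seq(2, 3, 4)), 0)(seqOp, combOp))

    // Three partitions: combOp runs three times -> 9 + 3*2 = 15
    println(aggregate(Seq(Seq(2), Seq(3), Seq(4)), 0)(seqOp, combOp))
  }
}
```

This is why a non-associative or non-commutative combine function (like adding a constant 2) gives results that vary with the number of partitions.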